From enemies that can attack more convincingly to the reflexive quest structure of Skyrim, there are lots of ways that game developers can make use of artificial intelligence. Michael Cook's research as part of Imperial College London's Computational Creativity Group is concerned with just one very simple application, however.
That is: can an AI design a game all by itself?
While it's a question that might sound outlandish to many, Michael's research has already proved that such a thing is at least possible on a limited scale. Earlier this year his ANGELINA AI demonstrated platform games it had built based on newspaper articles and, while the games were certainly crude in construction, they functioned excellently as a proof of concept.
Since then, Michael's been bulking ANGELINA out with extensions such as his new Mechanic Miner sub-system -- a feature which finds new possible mechanics in properly presented engines and can then design levels to suit.
Despite all this progress, ANGELINA has yet to light the fire you'd expect among game developers. Aside from a few articles with provocative headlines playing off the idea of human redundancy, ANGELINA has mostly been received with ambivalence -- a problem that other researchers in the field have experienced all too often.
"This is one of the problems with computational creativity," says Michael of the muted reaction so far. "Whenever you suggest a piece of software doing work independently and autonomously then professionals don't like it... [But] it's not that people are aggressive -- they just don't think that it could ever reach a decent level of quality."
"I've got colleagues who've had five page emails from professional artists outlining why they're obviously going to fail -- why they should quit their jobs and so on."
A Puzzling Present is ANGELINA's Christmas-themed showcase, created as a 'collaboration' with Michael.
To focus on the question of quality is to miss the point, however. The question the Computational Creativity Group is hoping to answer isn't whether computers can design triple-A games that will render big studios redundant; it's whether creativity is a uniquely organic experience.
Can an AI experience a flash of inspiration, or is that something reserved only for humans?
"I don't think there's any difference between the human mind and a computer," says Michael, though he admits a big problem is that we still know relatively little about how the human mind works. Instead, he thinks that humans tend to romanticize their own thoughts - and he argues it's that which has caused much of the industry's indifference so far.
"This is something that comes up a lot when I explain how ANGELINA uses evolutionary systems for level design, for example," says Michael. When he explains it to people they reject it because it appears like a formalized system, not a creative process.
Broadly speaking, evolutionary systems function pretty much as you'd expect from the name: a random population of objects is assessed against set parameters, with 'fit' instances then combined and re-assessed until a minimum quality level is attained. When ANGELINA uses such a system to create levels, it generates a random population of 2D levels, then assesses them against criteria such as 'Is Solvable' and 'Player Must Use Mechanic To Solve'. Passable results are then combined and the whole process loops again until the system is satisfied with the result.
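To make that loop concrete, here's a minimal sketch of the kind of evolutionary process described above. This is illustrative Python only, not ANGELINA's actual code; the flat-list level representation, the fitness checks, and the population and mutation settings are all placeholder assumptions.

```python
import random

# A 'level' is just a flat list of tile IDs; the fitness criteria below are
# stand-ins for checks like 'Is Solvable', not ANGELINA's real tests.
LEVEL_SIZE = 40
POPULATION_SIZE = 300   # Michael mentions roughly 300 random ideas per pass
GENERATIONS = 50

def random_level():
    return [random.choice([0, 1, 2]) for _ in range(LEVEL_SIZE)]

def fitness(level):
    # Placeholder scoring; a real system would simulate a playthrough.
    is_solvable = level.count(0) > LEVEL_SIZE // 2
    uses_mechanic = 2 in level
    return int(is_solvable) + int(uses_mechanic)

def combine(parent_a, parent_b):
    # Splice two 'fit' levels together, with an occasional random mutation.
    cut = random.randrange(1, LEVEL_SIZE)
    child = parent_a[:cut] + parent_b[cut:]
    if random.random() < 0.1:
        child[random.randrange(LEVEL_SIZE)] = random.choice([0, 1, 2])
    return child

population = [random_level() for _ in range(POPULATION_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == 2:        # 'satisfied with the result'
        break
    survivors = population[:POPULATION_SIZE // 4]
    population = [combine(*random.sample(survivors, 2))
                  for _ in range(POPULATION_SIZE)]

best_level = max(population, key=fitness)
```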
"Because it starts with randomness [researchers] really see evolutionary systems as a good thing," explains Michael. "It's ANGELINA going through 300 random ideas, identifying the good ones and iterating forward. I don't know about you, but that's how I try to solve problems too; I'll sit and doodle."
"Good ideas aren't intentionally produced; you just recognize that one of the random thoughts you were having was good... But nobody thinks of themselves that way. They think of themselves as incredibly creative. Artists will say that their muse descended on them and so on."
 A Puzzling Present features levels designed by ANGELINA, with Michael acting as a curator.
It's this sort of misunderstanding that Michael is determined to resolve. Currently he's working with local indie developers such as Alan Hazelden to see if ANGELINA can be imbued with more human-seeming attributes. Is it possible to give ANGELINA a sense of style, for example, that will make the levels it designs recognizable when compared to others? Such an approach will hopefully sift out some of the objectionable randomness, though Michael's aware that the artificiality of it may open ANGELINA to further criticism.
Then there's the thorny issue of creative repeatability, which has so far proved impossible to resolve. What does it say about ANGELINA that the creativity it displays can be replicated at the press of a button?
"Take The Binding of Isaac, for example. Edmund McMillen won't make another game about his experience of religion as a child and if he did then I imagine he wouldn't make another game like that," explains Michael. "But ANGELINA isn't like that; I can effectively travel back in time by faking the inputs and using snapshots of the sources."
So even if ANGELINA were able to express a unique style, repeatability would mean its work might still seem the product of a system rather than a creative process. Again, it's an issue that arises from the high opinion we have of ourselves: we think our creativity is unique and therefore valuable, while ANGELINA's is repeatable and therefore worthless -- not creative at all. Michael insists it's an unfair assumption to start from, given that it's impossible to verify whether our creativity is as rarefied as we think.
"People say that if I re-ran ANGELINA then I'd get the same game again, but I can't prove if that's true or not with humans because I can't go back in time," Michael points out with a sigh. "But people can always go back to these arguments... there are so many Get Out Of Jail Free Cards to oppose this sort of thing that I feel, at some point, we'll have to say we can't engage the arguments because they don't make sense."
"Some artists bring up the nature of the soul; something that is innately human, for example. That's something I feel you may as well disregard until such a point as computers are no longer synthetic," Michael continues. "Because really what you're saying there is that computers are machines and humans aren't."
A major point that was left out of the article was that Michael had to teach the AI what game design is by defining rules and expectations. This in itself is Michael imprinting his thoughts on what game design is. At this point, the AI becomes a tool for helping Michael explore possibilities that he didn't have time to test. The ability of the AI is limited by the fitness function that Michael has created for it, while the computer is just following instructions.
More: http://ironrebelstudios.wordpress.com/2013/04/01/ai-designers/
You're absolutely right - ANGELINA is constrained by the fitness functions I write for her, and that certainly leaves something of my shadow on the games. But this isn't something I'm happy with - and it's not something we'll be leaving as-is. Some creative systems have already tried to go beyond this, by inventing their own aesthetic measures for the things they create. We're definitely working on this issue!
Also - ANGELINA doesn't combine all possible ideas. It started with a random grab. The set of all possible ideas, even within ANGELINA's limited space, is still too vast! Evolutionary approaches tend to begin with a small subset, but a randomly selected one.
Practically speaking, AIs do seem to be on track to master the formal aesthetic attributes of all sorts of art forms, but a truly successful artwork is not only formally successful but imbued with expressive intent. You see something in the world and you say, "I have to say something about this." Or perhaps at first the idea just comes to you, but then as you work on it you start to see how X is like a metaphor for Y, and subtextually the whole thing could actually be about Z, and then that starts informing your more basic aesthetic decisions. Sometimes you go back and rework the earlier stuff because when you were making it you thought the work was about A, but now it seems like it's more about B. And then, even if an artist does all of that successfully, in order to be really successful you have to be in touch with the state of culture - the "zeitgeist", as it were. I don't think any artist is able to be in that position on purpose, and yet certain artists seem to hit that sweet spot again and again, so it doesn't seem to be random either. It's like certain people are just really in tune with the dominant cultural trends.
How could one even begin simulating such things? It seems to me that you'd need self-aware machines first...
I'm not sure ALL art requires intent (or at least, not the sort of romantic intent you describe) but certainly much of it does make use of it. Certainly, no artist makes a piece of art by accident! Intent is a big hurdle in computational creativity, but people are making attempts to get around it. I'm still not sure where I stand on the matter. I think intent may be part of that human element that creative works have that AI do not need to replicate. But certainly my supervisor (Simon Colton) is very interested in the notion!
I can also see the benefit in allowing Angelina to explore a wide range of ideas without really being concerned about whether each one is a good design. That sort of broad-stroke lateral thinking isn't always easy for people. And if Angelina is to be a tool alongside others, then someone will come along and develop that idea anyway (a lot like how we do it now).
I realize my thoughts may have come off as discrediting you or your work, which is not my intention. I think it's fantastic that you're taking on this large challenge and wish you well. I've experimented and seen the work of neural networks first hand and am not convinced of conscious thought deriving from it alone, which prompted my response. I'm curious to see how your project evolves because if you solve this, it would feel like you invented a chainsaw to sharpen a pencil when there is a forest waiting for such a tool.
On a side note, who takes credit or blame for a game that is created by an AI? The AI or the creator?
If you just want a program that solves a particular problem, then a top-down, heuristics-plus-brute-force program is a reasonable approach. But general digital creativity... per Hofstadter, we'll only see that when digital [sentiences] have grown, bottom-up, much like people do.
That doesn't mean the work with ANGELINA is worthless. The process of trying to take short cuts to creativity still reveals useful insights about, if nothing else, what *doesn't* seem truly creative. As a short-term, baby-steps process, I'd like to see the work on systems like ANGELINA continue.
But I do suspect that the real breakthroughs will be the products of longer-term work on self-organizing complexity.
I think by working on focused, generalisable systems - like Mechanic Miner - we can build systems that make a difference right now. General digital creativity is a great goal, of course. But for me, I'm interested in how we can push top-down approaches in novel and creative ways. I love the search for innovative shortcuts, even though the allure of true intelligence and creativity is incredibly strong.
Thanks for reading the piece and commenting!
I do think bottom-up "growing" of digital minds will ultimately prove to be the only way we'll see minds capable of creativity rivaling or surpassing our own human variety. All of GEB (if I may say so) seems to be a defense and explanation for that viewpoint, albeit in the most incredibly entertaining way I've ever seen. (Which is part of the point that Hofstadter is making.)
But that's a long way off, and I'm convinced that more direct efforts like ANGELINA are necessary steps for figuring out how to get there. We learn by trying. Best of luck to you!
While a procedurally-generated dungeon crawler may spontaneously create a +5/+3 Sword and hide it in a chest behind a Hydra (i.e. taking existing mechanics and tweaking values), something like ANGELINA would be inventing new mechanics (chests that teleport you, swords that drain your health if you don't use them often enough, etc.).
In short, those things are robotic level designers. This is a robotic game designer (which, out of necessity, also functions as a level designer for its game).
What we're trying to do with ANGELINA is to pull away from that end of the spectrum. To introduce more unpredictability, to cover a large space of solutions so that unexpected innovation can be found. We want ANGELINA to exert aesthetics and opinions on the content it creates - and, over time, to create those aesthetics and opinions itself.
Very excited to see this piece online. I'm travelling to a conference tomorrow (to talk about Mechanic Miner, no less!) but I will try and respond to comments and questions as soon as I can.
I just wanted to comment again to say I am an idiot - I knew about you and your City Conquest work! It's terrific stuff that I talk about a lot, because it's a great example of how evolution (and AI in general) can be leveraged in new and unusual ways that really matter to the industry. This is where I want to see AI being used. AI is not about guiding people around maps any more. It is a wonderful way of manipulating programs and data to do clever, new things. City Conquest's use of it is just... great. I've been at EvoStar this week in Vienna, and the work has come up many times with many different people. Great stuff!
One more note: I think this kind of research is awesome, and it's a good thing the researchers themselves hold their work to a very realistic standard. I find the idea that a computer could, alone, design new game systems today to be outrageous, but tomorrow is a brand new world, and it's research like this that will spark the creativity of the millions of people who will make just that happen in the future.
The concept of breaking this down into a process that a computer can perform may not produce the same outcome or results as providing this information to a human, but does it have to? It seems like, as we define these ideas and teach them to others, we can also write them down in a form that would allow technology to iterate on them in its own way.
While I don't feel that we will have an AI creating games the same way humans do, I don't necessarily think it needs to. Why would we want something that merely replicates? Don't we want something that comes up with something unique that we didn't think of? The goal here seems to be something new, so I don't think we should be focused too much on it doing what we do, but on it doing what it does, and seeing where that takes it. That interests me greatly.
But as for true experience, emotional connection - no. Computers won't experience the world exactly as I do. But perhaps that can be used as a strength. We should seek to see what computers can do despite their limitations, and perhaps discover a different way in which creativity can be expressed or understood! I hope so, at least.
Of course your comments on iterative design are also great points. We definitely want AI to play to those strengths too.
Interesting project.
I think two relatively important features of an AI like this would be:
1) The ability of the AI to play the game to receive data on the simplicity/complexity of the level (also to determine if it can be completed)
2) Emotions that change based on several parameters that change far slower than other variables.
The first part allows the computer to gauge how hard a level is; the second allows the computer to get frustrated/excited with its own work.
Frustration would essentially make the program start recursively cutting pieces of the level/design until the AI is content again [hopefully it doesn't scrap the entire thing ;) ].
Excitement would make the AI try similar strategies in the design to explore its current field of study (until it stops working), then it goes back to being either content or frustrated.
Easier said than done (although it would be interesting to watch said AI play its own creation).
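Purely as an illustration, the suggested frustration/excitement loop might look something like the following sketch. Everything here is hypothetical: the playtest function, the thresholds, and the slow drift of the emotion parameters are placeholders, not anything ANGELINA actually does.

```python
import random

def playtest(level):
    """Hypothetical stand-in for point 1: 'play' the level and return a
    difficulty estimate, or None if it can't be completed at all."""
    if not level:
        return None
    return random.uniform(0, 1)

# Point 2: emotion parameters that drift slowly rather than jumping around.
frustration = 0.0
excitement = 0.0

def design_step(level):
    global frustration, excitement
    difficulty = playtest(level)

    if difficulty is None or difficulty > 0.8:
        frustration = min(1.0, frustration + 0.1)   # slow build-up
        excitement = max(0.0, excitement - 0.05)
    else:
        excitement = min(1.0, excitement + 0.1)
        frustration = max(0.0, frustration - 0.05)

    if frustration > 0.5:
        # Frustration: cut pieces of the design until the AI is content again.
        level = level[:max(1, len(level) // 2)]
    elif excitement > 0.5:
        # Excitement: push the current idea a little further.
        level = level + [random.choice(level)]
    return level

# Example: iterate a random starting level through a few 'moody' passes.
level = [random.choice([0, 1, 2]) for _ in range(20)]
for _ in range(10):
    level = design_step(level)
```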
As for 2) - I do like the idea of qualitative responses from creative software. I'm not sure how to carry it out yet. Some people don't like the idea of human mimicry - after all, an AI can't experience frustration in exactly the same way as me. But perhaps we can think of emotions in a different way for AI.
I like this idea too. There's an interesting set of work going on in mixed-initiative tools where designers collaborate with creative (or generative) software to express themselves. I hope this can move into more complex spaces, like mechanical or thematic ones, for game design soon.
The question isn't "how can we get computers to design games for us?" but "how can we use AI to enhance designers' productivity?" What we need isn't so much an "auto-design" program but a virtual wind tunnel for game design that can help us quickly identify how any incremental design change will modify the aerodynamics of the user experience.
But yes. Of course the question of 'enjoyment' is very difficult, and may be a problem that transcends AI methodologies. Where do we find culture? common sense? humanity? There are too many answers to this. For me, I want elegance and simplicity even if this comes at the expense of beauty and depth. I want to hack the Internet to give me the raw truth about human culture.
Unlike the many other people who view computers designing games as an inherently limited and perhaps futile attempt, I'm thoroughly supportive and fascinated by what should be better considered an attempt to characterize and understand human creativity. Much in the same way many seemingly complex animal behaviors decompose into surprisingly simplistic rules, there's an interesting possibility where your work might unearth a path to programmatically producing "fun." Which is half the battle in understanding it. Phenomena such as termites creating piles, birds flying in a flock, or crowds passing each other in opposite directions boil down to a few easy logical conditions, and what everyone insists is an inability to be creative seems to me to be a way to discover clear, consistent rules to improve game design.
I won't pretend there isn't some mystique in the process through which humans create, or that to get a computer to simulate the erratic and irrational way our minds work might be a less likely accomplishment. But as an intellectual exercise, you're pulling back the curtain on the dark art of game design, in some sense, and I applaud that!
Of course, I'm a computer scientist, and as such my view of what's creative and what's not might be skewed as a result. Brilliant ideas are often the result of trying ideas outside of the defined search space, but I'd say as humans we classify a lot of things as creative that willingly demonstrate thinking inside the box.
Best of luck!
Jonathan
Like you, I find mystique in the process to a certain extent too. There are mindsets you get into - when I'm working as a scientist, I think in cold terms, but when I sit back and read Rock, Paper, Shotgun (or Gamasutra! Haha) I get excited and sit in wonderment. Humans are still so far above computers that a really great idea will still grab me and make me think "Wow. Someone thought of that. Humans are incredible."
Hopefully even through understanding this, we will still see magic in it. We understand how people run fast, but watching a 100m sprinter still fills us with awe.