Machine-generated text responding to conversational, human-provided input has been an application of computing since at least Eliza, but AI Dungeon 2 takes the idea to new extremes.
It essentially plays like one of those old Infocom-style adventure games, but with no limits on what the player can enter: whatever the input, it improvises prose events in response.
While it’s still not up to the level of classic or modern text adventures, and occasionally loses track of situations over multiple turns, AI Dungeon 2 sometimes surprises in the kinds of responses it can generate, and the unexpected directions it can take the user’s adventures.
The game was created by Nick Walton of Brigham Young University’s Perception, Control, and Cognition Laboratory, and it caused quite a stir upon its unveiling, so much so that the team had to quickly rework it to cut down on bandwidth costs. Walton answered some questions we had about how AI Dungeon 2 was made and where it’s going next.
Who are you, what is AI Dungeon 2, and who helped make it a reality?
I've been working on deep learning tech for the past couple years. I interned at a couple autonomous vehicle companies and have been doing research in the BYU Perception, Control, and Cognition Laboratory, a deep learning research lab at BYU.
I've been working on AI Dungeon since I had the idea at a hackathon in March. I continued to work on the idea, getting advice from Dr. David Wingate, the professor I worked with. Eventually it finally came together as AI Dungeon 2 at the beginning of December.
AI Dungeon is a first-of-its-kind game where the story and the responses to your actions are entirely generated by deep learning. The awesome thing about this is that the game is built on top of a giant machine learning model made by OpenAI called GPT-2. It can adapt and respond to almost any action you can imagine, giving players vast amounts of freedom they've never had in a game before.
AI Dungeon 2's popularity shot up suddenly, almost overnight. What happened?
I had gotten some attention with AI Dungeon 1. A couple thousand people had played it when I had released it back in May, so when I started teasing snippets from AI Dungeon 2 people started paying attention. But when I actually released AI Dungeon 2 it exploded much faster than I had expected.
We had released it as a game people could play in Google Colab, a way for researchers to run code on Google machines for free, but it would download the 5 GB model each time someone played.
Within a few days we had racked up over $20,000 in bandwidth charges from so many people downloading the model. We had to temporarily shut it down until some awesome community contributors put up a peer-to-peer downloading solution.
Now AI Dungeon 2 is playable as a web and mobile app, and a team has come together to work on improving the game. An awesome app developer, Braydon Batungbacal, built out the mobile apps in a week; my brother Alan Walton helped us build out the infrastructure to host the massive model at a large scale; and Thorsten Kreutz is helping us work on the long term of where this idea can go. We see AI Dungeon 2 as just the beginning of an exciting shift in entertainment.
The format of AI Dungeon 2's output appears to emulate that of the classic Infocom text adventures. AI Dungeon gets its text corpus from Choose Your Story, however. Was any form of data massaging needed to produce usable input and/or output?
Yeah, there is a decent amount of model and data massaging that happens on the game side to make sure the model plays nicely. We cut off trailing sentences and end-of-text tokens in the model output, and we do some transforming of the player's input to make sure it's in second-person format. I also had to do a decent amount of work to control repetition, as that's a big problem with GPT-2, especially at lower temperature (randomness) settings.
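The exact cleanup rules in AI Dungeon 2 aren't spelled out in the interview, but the steps Walton describes might look something like this rough sketch. The `<|endoftext|>` token is GPT-2's real end-of-text marker; the sentence-trimming heuristic and the pronoun mapping are purely illustrative assumptions:

```python
import re

END_TOKEN = "<|endoftext|>"  # GPT-2's end-of-text marker

def clean_output(raw):
    """Drop everything after an end-of-text token, then cut any
    trailing sentence fragment left by the generation cutoff."""
    text = raw.split(END_TOKEN)[0]
    # Keep only complete sentences: cut after the last ., ! or ?
    last_end = max(text.rfind("."), text.rfind("!"), text.rfind("?"))
    if last_end != -1:
        text = text[: last_end + 1]
    return text.strip()

def to_second_person(action):
    """Rewrite a first-person player action into the second-person
    style of the training stories (a rough, illustrative mapping)."""
    subs = [(r"\bI\b", "You"), (r"\bmy\b", "your"), (r"\bme\b", "you")]
    for pattern, repl in subs:
        action = re.sub(pattern, repl, action)
    return action

print(clean_output("You draw your sword. The goblin flees into the"))
print(to_second_person("I swing my sword"))
```

A real pipeline would need far more robust grammar handling than a few regexes, but the shape is the same: normalize the player's input before it reaches the model, and trim the model's output before it reaches the player.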
Making your own text generator is a fun hobbyist programming project. GPT-2, while a great deal more sophisticated and trained on a much larger set of data, seems to do a similar kind of thing. Could you give us an overview of how it works?
GPT-2 is a deep learning language model. A language model essentially just tries to predict the most likely next word given the previous set of words. The really cool thing about language models is that, because you can train them on any text (no labeling required), they can learn from the really massive amount of text data that's out there. The largest GPT-2 model was trained on 40 GB of text data.
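As a toy illustration of "predict the most likely next word," here is a bigram model over a tiny made-up corpus. GPT-2 itself is a large Transformer neural network, not word counts, but the predict-and-sample loop, including the temperature knob Walton mentions, is the same basic idea:

```python
import random
from collections import Counter, defaultdict

# Tiny corpus standing in for GPT-2's 40 GB of training text.
corpus = ("you enter the cave . you draw your sword . "
          "the goblin draws its sword . you strike the goblin .").split()

# Count which word follows which word: a bigram "language model".
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(prev):
    """Return the most likely next word given the previous word."""
    return follows[prev].most_common(1)[0][0]

rng = random.Random(0)

def sample(prev, temperature=1.0):
    """Sample the next word. Lower temperature sharpens the distribution
    toward the most likely word; in a real model, that is also where
    repetition problems tend to creep in."""
    counts = follows[prev]
    words = list(counts)
    weights = [c ** (1.0 / temperature) for c in counts.values()]
    return rng.choices(words, weights=weights)[0]

print(predict("the"))  # "goblin" follows "the" most often in this corpus
print(sample("you", temperature=0.5))
```

Generating a story is just running this loop repeatedly, feeding each sampled word back in as context; GPT-2 does the same, except its "counts" are replaced by a neural network conditioned on hundreds of previous words.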
In learning to predict the most likely next word, it learns not just how the English language works; it also learns a model of the world that lets it predict the most likely next event (for example, if you get stabbed you might die), and that allows you to do some really interesting things with it.
The GPT-2 system is available for download for people who'd like to play around with it. How would you recommend interested people might get started?
There are some really great Python packages that let you fine-tune GPT-2 and play around with it yourself. GPT-2-Simple by Max Woolf is probably the easiest way to get started, and you can run it in a Google Colab notebook like the one here.
The site for AI Dungeon 2 notes that you're beginning to sell mobile app interfaces to help fund the server and bandwidth that the game requires. How is that going? Are you still worried that the project may have to end due to financial concerns?
We've turned the game into a startup as there's really no way to make a great experience without having the financial sustainability of a business. We plan on keeping the base version of the game free to play, but will add a premium version of the game that will allow us to cover the server costs.
The premium version will have a lot of interesting new game modes and features so we're pretty confident we'll be able to make the game sustainable with that.
Is AI Dungeon 2, as it stands, complete? Are there any improvements, or a sequel, planned? Maybe allowing for more starting points, or to provide greater continuity between turns?
We have a lot of different things we're working on. In the near term, we're working on adding multiplayer and text to voice support, but we're also looking at more innovative improvements in the future.
A text generator like this seems like it might have huge and hilarious applications beyond just simulating a text adventure game! Google is trying to get people used to thinking of AIs in non-Skynet terms with their free course, and researcher Janelle Shane has put an AI of their own creation to work on such subjects as naming kittens. Do you have plans to continue with text generation, or are you moving on to other applications?
Text generation will continue to be a big part of what we're trying to make with AI Dungeon and its sequels, but we'll also start adding in other forms of AI content generation with music, images, and voice as we're able to.
Most of our readers are game developers, and some of those use bespoke algorithms to do things like randomly generate terrain to explore. GPT-2 is built to generate text based on other text found on the internet; do you think the underlying system might someday be of use in the production of more traditional types of entertainment software?
Definitely. I think AI Dungeon is just the beginning of a huge evolution in gaming, and we're hoping to help enable that. There's a lot of exciting potential this technology opens up, and we're excited to see what that future looks like.