I just came across a great study out of MIT about how game manuals get used – but not by people! Professor Barzilay and her team created a computer program that learned to play a game better by reading its manual.
This is potentially a very big deal. Reading – or, as the artificial intelligence folks call it, natural language processing – is one of those problems that theorists thought would be solved fifty years ago, but turned out to be way tougher than expected. Because there are so many ways to express any given concept, it’s extremely difficult to translate text into the underlying logical concepts that a computer can understand.
As part of her research, Barzilay developed a program that could play the game Civilization. The program knew little about the game at the outset, but it could move the cursor, see what was on the screen, and tell whether it had won or lost. Over time, the program learned to play more and more effectively – winning up to 62% of its games. Next, Barzilay let the program examine the text of the game’s manual. It didn’t know what the words meant, but it could match words that appeared on screen against words in the manual and use those matches to guide its actions. Using this strategy, the program’s win rate shot up to 79%.
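To make the idea concrete, here’s a toy sketch of that strategy – emphatically *not* the MIT team’s actual system, just an illustration under my own assumptions. The action names, word sets, and update rule are all invented: it ranks hypothetical manual passages by word overlap with what’s on screen, then nudges each word’s learned weight up or down depending on whether the game was won.

```python
from collections import defaultdict

# Invented stand-ins for manual passages: each maps a suggested
# action to the bag of words in the passage describing it.
manual = {
    "build city": {"settler", "build", "city"},
    "research writing": {"research", "writing", "library"},
}

# Learned usefulness of each manual word (starts neutral at 1.0).
weights = defaultdict(lambda: 1.0)

def pick_action(screen_words):
    """Choose the action whose manual passage best overlaps the screen."""
    def score(action):
        return sum(weights[w] for w in manual[action] & screen_words)
    return max(manual, key=score)

def update(screen_words, action, won):
    """Reward or punish the words that drove the chosen action."""
    delta = 0.1 if won else -0.1
    for w in manual[action] & screen_words:
        weights[w] += delta

screen = {"settler", "city", "river"}
print(pick_action(screen))  # "build city" overlaps most
update(screen, "build city", won=True)
```

The key move – and the part that echoes the study – is that text only acquires value through the win/loss signal: a passage matters to the learner exactly insofar as following it pays off in play.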
As the MIT article points out, this research shows a possible new direction for natural language research. But what excites me about this is how strangely similar it is to the way that people use game manuals.
Jim Gee writes about game manuals and “situated meaning.” He argues that for most people*, the words in the manual are just words. We don’t easily translate them into the actions we’re supposed to take while playing. Instead, we play the game, and then use the manual to help us understand what our actions mean. Reading about, say, the V.A.T.S. in Fallout 3 is very different from using it, then looking it up in the manual for additional insight. In the latter case, we already have a sense of what the V.A.T.S. does** because we have experienced it, and we know how our use of it may or may not serve our larger goals in the game. All of a sudden we have a specific and meaningful context in which to make sense of the words in the manual – which, otherwise, are just words.
Barzilay is essentially taking Gee’s insight and putting it into practice. She lets the computer make sense of text in the context of play. Did this insight from the manual help me win the game? Which pieces of the manual are related to the in-game actions I’m taking now? These are questions Barzilay’s program asks itself – but they’re questions we human beings ask ourselves, too.
Neat.
* I think we all know someone who insists on reading the whole manual before starting a new game – but even they probably have trouble applying what they’ve learned until they get a chance to play.
** Of course, different people use the system differently. In my case, I often used the V.A.T.S. to see where enemies might be lurking in the environment, because they would be outlined in red against a dark background.