Demis Hassabis, cofounder of Google-owned firm DeepMind, and Jeff Dean, who leads the Google Brain project, have both hinted that StarCraft will be their next target, while Facebook researchers have just released an open-source platform designed to help people develop AI to play the game.
Succeeding in StarCraft would be a show of strategic strength. AI’s gaming prowess reached new heights in March when DeepMind’s AlphaGo system defeated one of the world’s best Go players, Lee Sedol. The AI’s ability to win at Go was particularly impressive owing to the complexity of the ancient Chinese game. There are more possible board positions in Go than there are atoms in the observable universe, so AlphaGo couldn’t work out its strategy simply by “solving” the game. Instead, its neural networks were trained using a database of 30 million moves made by expert human players. The software could then evaluate how each potential move in a real-life game would alter its overall chance of victory, allowing it to choose the best one.
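That evaluation step can be illustrated with a minimal sketch: a trained value network scores each candidate move by the estimated win probability of the position it leads to, and the agent picks the highest-scoring move. The `value_network` below is a hypothetical stand-in (real systems use a deep neural network combined with tree search), and the toy board representation is an assumption for illustration only.

```python
import random

def value_network(position):
    """Stand-in for a trained value network: returns an estimated
    win probability in [0, 1] for the given position. A real network
    would be learned from expert games; here we just produce a
    deterministic toy score per position."""
    rng = random.Random(str(position))  # deterministic toy scores
    return rng.random()

def choose_move(position, legal_moves):
    """Pick the move whose resulting position has the highest
    estimated win probability."""
    def resulting_position(move):
        return position + (move,)  # toy board: a tuple of moves played
    return max(legal_moves, key=lambda m: value_network(resulting_position(m)))

position = ()  # empty toy board
moves = ["D4", "Q16", "C3"]
best = choose_move(position, moves)
```

The key design point mirrors the article: rather than searching the game exhaustively, the agent reduces move choice to a learned evaluation of “how much does this move improve my chance of winning?”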
StarCraft is a popular video game in which players build huge armies to battle other players across a large virtual terrain. Players can’t see exactly what their opponents are up to, so they have to make decisions based on incomplete information – just like in the real world. Mastering the chaos of StarCraft will therefore have implications beyond video games: it should improve AI’s ability to deal with reality.