Blizzard made a very curious announcement about StarCraft II at BlizzCon 2016. Instead of a new expansion pack, the game is being opened up to Google's DeepMind project, which will use it to teach an AI system how to play an RTS.
DeepMind made headlines earlier this year when its AlphaGo AI managed to beat a world-class Go player, a feat that was believed to be impossible. The number of possible moves in Go was originally thought to be too great for a computer to evaluate within the time constraints of a professional match. Despite this, DeepMind pulled off a 4–1 victory over Lee Sedol.
Opening StarCraft II to DeepMind introduces a whole new level of complications for the research staff. For one, the game offers a far wider range of choices than a board game. Additionally, both players' actions are concealed from each other, which forces players to scout their enemies and memorise what they saw.
DeepMind's project is not ready to play against humans – or StarCraft II's built-in AI – just yet. For now, the researchers are working with Blizzard to design learning maps and scenarios to teach the AI how to play the game.
The project may take a couple of years to complete, but we could be seeing the rise of machines capable of even more complex decision-making. It might even end up beating the top Korean players at StarCraft II, which would be a massive feat on its own.