Teaching machines the art of war
THE war between man and machine is coming.
In a timely coincidence, James Cameron’s classic film ‘Terminator 2: Judgment Day’ has been re-released to theatres in 3D just as Blizzard Entertainment and Google’s DeepMind have announced the release of an application programming interface (API) that will give artificial intelligence (AI) researchers the ability to create machine learning bots capable of tackling Blizzard’s popular esports title, Starcraft 2.
In March 2016, DeepMind’s AI program AlphaGo defeated world champion ‘Go’ player Lee Sedol four games to one, displaying the already impressive ability of artificial intelligence to master the complex strategy game invented in ancient China.
Now DeepMind have their sights set on a far, far greater challenge, which is an exciting prospect, even if it does invite troubling comparisons to Terminator 2’s omnipresent villain, ‘Skynet’.
For those unfamiliar with Blizzard’s hit video game series, Starcraft 2 is a science fiction military strategy game.
And just like Go, it involves at least two opposing players vying for control of territory in order to defeat their opponent - but that is where the similarities end.
For the purposes of artificial intelligence training, Go is a so-called ‘perfect information’ game, which means you can see all your opponent’s moves, and those moves are finite, even if the number of possible board positions is roughly 10 to the power of 170.
With the right hardware and enough training time, it’s possible to teach a computer to evaluate those positions well enough to outplay any human.
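For a perfect-information game small enough to search exhaustively, the idea can be sketched in a few lines. The toy example below is not DeepMind’s method (AlphaGo relies on neural networks and self-play, since Go is far too large to enumerate); it simply solves tic-tac-toe by checking every reachable position, which is only feasible because that game tree is vanishingly small compared with Go’s.

```python
# Toy illustration: exhaustively solving a perfect-information game.
# Tic-tac-toe is small enough to search completely; Go's ~10^170
# positions rule this brute-force approach out entirely.

def winner(board):
    """Return 'X' or 'O' if that player has three in a row, else None."""
    lines = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
    for a, b, c in lines:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Best achievable score for `player`: +1 win, 0 draw, -1 loss."""
    won = winner(board)
    if won:
        return 1 if won == player else -1
    if ' ' not in board:
        return 0  # board full: draw
    other = 'O' if player == 'X' else 'X'
    scores = []
    for i, cell in enumerate(board):
        if cell == ' ':
            board[i] = player
            scores.append(-minimax(board, other))  # opponent replies optimally
            board[i] = ' '
    return max(scores)

# From an empty board, perfect play by both sides is a draw.
print(minimax([' '] * 9, 'X'))  # 0
```

Because every line of play is examined, the program’s answer is provably correct; it is the impossibility of doing this at Go or Starcraft scale that forces researchers towards learning-based approaches instead.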
Starcraft 2, on the other hand, is infinitely more complex.
For one thing, ‘moves’ happen simultaneously in real time.
For another, a player can’t see their opponent’s moves unless they move a scout into their territory, and even then, interpretation of those moves is required.
Lastly, the game pieces, or ‘units’, each have specific strengths, weaknesses and abilities, and those strengths and weaknesses are modified by the terrain they occupy relative to the opposing units.
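To illustrate that last point, here is a minimal sketch of why unit match-ups complicate a bot’s evaluation of the battlefield. The unit names are real, but the numbers and the terrain modifier are invented for illustration, not Starcraft 2’s actual stats.

```python
# Toy illustration (invented numbers, not Starcraft 2's real stats):
# the outcome of an engagement depends on unit type, the defender's
# armour, and a terrain bonus such as holding the high ground.

# Hypothetical unit stats: (damage per hit, armour)
UNITS = {
    'marine':  (6, 0),
    'roach':   (16, 1),
    'stalker': (13, 1),
}

def effective_damage(attacker, defender, high_ground=False):
    """Damage per hit after armour, with an assumed 25% high-ground bonus."""
    damage, _ = UNITS[attacker]
    _, armour = UNITS[defender]
    if high_ground:
        damage *= 1.25  # assumed terrain modifier for illustration
    return max(damage - armour, 0.5)  # armour can't reduce a hit below 0.5

print(effective_damage('marine', 'roach'))        # 6 - 1 = 5
print(effective_damage('marine', 'roach', True))  # 6 * 1.25 - 1 = 6.5
```

Even in this stripped-down model, a bot weighing up a fight has to reason about which units are engaging which, and where; the real game multiplies that across dozens of unit types, abilities and terrain features.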
In essence, Starcraft 2 consists of almost all the complexity a computer would face if it were a real military strategist commanding forces in a real battle.
The sheer amount of possibilities, moves, and countermoves is dizzying to comprehend for the purpose of creating a computer program capable of learning how to play.
Which is why DeepMind and other AI researchers don’t expect a computer to be able to defeat a human player for at least five years.
It also means that researchers can’t simply place the AI in a real, full-fledged game of Starcraft from the outset and expect it to learn efficiently, if at all.
Instead, just like a human player might begin to learn to play, DeepMind and Blizzard have broken the game down into mini-games involving individual tasks and aspects of a full match.
To begin with, the AI bots are simply being tasked with collecting the resources on the map required to build units, or with moving a unit to a particular point on the map.
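To get a feel for what such a mini-game looks like to an agent, here is a self-contained toy analogue, not the real DeepMind/Blizzard environment (which requires the game itself and its API): a ‘move to beacon’ task on a small grid, solved by a hand-coded greedy policy. A learning bot would have to discover this behaviour for itself from observations and rewards rather than being told it.

```python
import random

# Toy analogue of the 'move to beacon' mini-game: an agent on a grid
# earns a reward for reaching a beacon, which then respawns elsewhere.
# Illustration only; the real environment is the Starcraft 2 API.

SIZE = 8  # grid is SIZE x SIZE

def step_towards(agent, beacon):
    """Greedy policy: move one square along each axis towards the beacon."""
    ax, ay = agent
    bx, by = beacon
    ax += (bx > ax) - (bx < ax)
    ay += (by > ay) - (by < ay)
    return (ax, ay)

def run_episode(steps=50, seed=0):
    rng = random.Random(seed)
    agent = (0, 0)
    beacon = (rng.randrange(SIZE), rng.randrange(SIZE))
    reward = 0
    for _ in range(steps):
        agent = step_towards(agent, beacon)
        if agent == beacon:          # beacon reached: score and respawn
            reward += 1
            beacon = (rng.randrange(SIZE), rng.randrange(SIZE))
    return reward

print(run_episode())  # the greedy agent collects several beacons in 50 steps
```

The appeal of starting with tasks this simple is that the reward signal is frequent and unambiguous, which makes them a natural first rung before an agent graduates to full matches, where feedback may not arrive until a game is won or lost.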
So the good news is we’re a while away from needing to worry about Google inadvertently unleashing an army of Terminators on us all.
And the other good news is that if artificial intelligence becomes good enough to defeat human players in Starcraft 2, it will mean we’ve reached a point in machine learning that will open up exciting possibilities and applications beyond merely playing strategy games.
In the words of Starcraft 2 character Tychus Findlay, ‘Hell, it’s about time.’
JUDGEMENT DAY: DeepMind and Blizzard Entertainment have released programming tools to teach machine learning bots to play Starcraft 2.