
By Tommy Thompson
Senior researcher in AI and games
YouTube @TableFlipGames
HyperAI Introduction
Humans finally beat AI in DOTA2.
Today, the knockout rounds of TI8 (The International, the world's top DOTA2 tournament) entered their third day. Beyond the regrettable elimination of the Chinese team VG, the match between OpenAI Five and paiN Gaming became the highlight. Unlike last year, when OpenAI's bot defeated the professional player Dendi in a 1v1 exhibition, this time OpenAI Five lost its first game, and human players seemed to glimpse a little hope again.
In fact, when researchers train game-playing AI, they have long favored Blizzard's StarCraft, a classic whose standing among players is no less than that of World of Warcraft.
As a real-time strategy game built on mechanics such as multi-unit coordination and macro-level planning, a single match can unfold in countless ways, which makes it well suited for training AI decision-making under uncertainty and for stress-testing algorithm efficiency. Today the game has become an AI testing ground for technology companies such as Google and Microsoft.
Real-time strategy games: the best games for training AI
Real-time strategy (RTS) games are a branch of strategy games. Unlike turn-based games, play unfolds continuously in real time. The player typically acts as a commander, handling macro-level operations such as deploying troops and formulating strategy. The best-known examples include "StarCraft", "Warcraft", and "Red Alert".
Games of this type usually feature a fog of war: every player has limited vision and can see only the area around their own units. Players explore the world to gather resources, build bases, and climb a technology tree (for example, combining materials into weapons). Victory comes only from destroying the enemy's fortifications and eliminating their forces, or from withstanding every attack.
During a match, how players allocate resources, plan attack and defense, and how well they know the map all become key factors in the outcome.
Players must therefore explore the unknown, monitor the state of battle, and adjust their offensive and defensive strategy continuously. Moreover, the search space of such games is enormous; a single match can branch into countless possibilities. RTS games are thus well suited to training AI decision-making in uncertain scenarios and to testing the efficiency of AI algorithms.
Training AI starts with conquering StarCraft
Among all RTS games, "StarCraft" is undoubtedly the best choice.
The game is well known and has a large player base, which supplies the vast amount of human gameplay data needed to train AI. It is also highly complex, making it an effective stress test for an AI model's algorithms. Researchers therefore began trying to train AI models on "StarCraft" very early on.
As a game of incomplete information, StarCraft's 128 x 128 maps create a huge search space. The search space of Go is about 10^170, while StarCraft's can reach roughly 10^1685. The game also allows up to 400 units of different types, each with its own attributes, producing countless possible situations, so an AI must run far more computation to cope.
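StarCraft's search-space estimate can be sanity-checked with a back-of-the-envelope calculation: if each of up to 400 units can stand on any of the 128 x 128 tiles, the number of joint position assignments alone is (128 x 128)^400. A minimal sketch (it ignores unit types, health, and other state, so it understates the true figure):

```python
import math

MAP_SIDE = 128    # Brood War map width/height in tiles
MAX_UNITS = 400   # unit cap cited above

# Positions available to a single unit.
positions = MAP_SIDE * MAP_SIDE

# log10 of the number of joint position assignments for 400 units:
# log10(positions ** MAX_UNITS) = MAX_UNITS * log10(positions).
log10_states = MAX_UNITS * math.log10(positions)

print(round(log10_states))  # 1686, i.e. on the order of 10^1685
```

Even this crude count of unit positions already lands at the cited order of magnitude, far beyond Go's roughly 10^170 states.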
StarCraft: Remastered (2017)
Moreover, players from opposing factions can encounter each other at any moment, and the AI must react within milliseconds. It must remember and analyze information in real time, integrate every chance encounter or scouted detail, and continually adjust its strategy. And because the game is a virtual environment, the cost of trial and error during training is very low. Together, these factors make "StarCraft" an ideal training ground for AI.
How does AI learn?
ORTS:
An RTS battle platform built specifically for AI research
ORTS stands for Open Real Time Strategy, a real-time strategy game platform with an open RTS API. It was developed in 2003 by Michael Buro, a computer science professor at the University of Alberta in Canada, specifically for training AI models on RTS games.
This 3D RTS platform, built on OpenGL, can simulate most real-time strategy games on the market. ORTS's communication protocol and source code are fully public: users can freely access a large library of maps, models, and unit data, and remotely build AI controllers to train AI models.
The Open Real-Time Strategy (ORTS) platform can be found on GitHub: https://github.com/benoit-dumas/OpenRTS
ORTS made it possible for developers to build AI controllers for real-time strategy games, but it could only replicate the mechanics of existing RTS titles and could not connect to the native games themselves. With few human players involved, ORTS was populated almost entirely by AI players.
The story took a turn in 2009: at the final RTS AI competition hosted on ORTS, BWAPI (the Brood War API) was released, the first external AI API for the "StarCraft" series.
BWAPI:
You can create AI characters directly in "StarCraft"
BWAPI is an open-source C++ framework, meaning any developer can train their own AI model inside "StarCraft". It exposes human players' game data to the AI, along with information such as the current game state, available unit types, and the status of the technology tree.
In matches, to keep AI players from cheating, the framework stipulates that an AI can only read an enemy's position, attributes, and other information when that enemy is out of the fog of war and inside the AI's field of vision.
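The rule can be illustrated with a small sketch. Everything here (the function name, the tuple layout, the numbers) is hypothetical and not part of the real BWAPI, which is a C++ framework; this only shows the shape of the visibility check the rule describes:

```python
import math

def can_query_enemy(enemy_pos, own_units):
    """True if some friendly unit has the enemy tile within sight range,
    i.e. the enemy is not hidden by the fog of war."""
    ex, ey = enemy_pos
    for ux, uy, sight in own_units:
        if math.hypot(ex - ux, ey - uy) <= sight:
            return True   # enemy visible: attributes may be read
    return False          # enemy in the fog: no information allowed

army = [(10, 10, 7), (40, 12, 9)]        # (x, y, sight_range) per unit
print(can_query_enemy((12, 14), army))   # True: ~4.5 tiles from first unit
print(can_query_enemy((80, 80), army))   # False: out of everyone's sight
```

Only information that passes such a check is handed to the AI, so the bot plays under the same fog as a human.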
However, because an AI can issue far more actions per frame than a human player, odd behaviors can emerge in play, such as ground units walking on walls or buildings sliding across the map.
Although BWAPI has flaws in practice, it undeniably realized the idea of training AI models inside "StarCraft", advanced AI research, and laid the groundwork for the "StarCraft" competitions held by AIIDE (the Artificial Intelligence and Interactive Digital Entertainment conference).
AIIDE:
Let more people participate in the "StarCraft" AI competition
In 2010, AIIDE, then one of the largest game-AI research conferences in the world, held the first "StarCraft AI Championship", contested entirely by AI. The event was divided into four tournaments, each with a different focus, to showcase AI models in different combat scenarios:
- The first and second tournaments focused on managing military units in complex terrain;
- The third used a fog-free mode with full vision, but high-tier units were banned;
- The fourth was a standard match: fog of war enabled, with advanced units allowed.
The competition was a huge success, with 26 AI teams entering and 17 completing the whole event. The first championship went to the Zerg AI player Overmind, developed by a research team at the University of California, Berkeley.
Notably, one of Overmind's developers, Oriol Vinyals, now a research scientist at Google DeepMind, was once Spain's "StarCraft" champion. Although Overmind won the AIIDE competition, it struggled to beat Vinyals himself.
In 2011, as the number of entrants grew, AIIDE required AI players to publish their source code to ensure fairness, and all matches were run on servers with a client/server (C/S) architecture.
At that year's second "StarCraft AI Competition", the UK entry SkyNet won the championship with a strategy of early defense and periodic pushes. It was developed single-handedly by British engineer Andrew Smith, demonstrating the feasibility of AI algorithms in strategy formulation.
By 2012, AIIDE competitions permanently archived game replays, meaning an AI could learn by watching previous matches. AIIDE still holds "StarCraft" AI competitions today, encouraging more people to build "StarCraft" AI models and push the field forward.
When AIIDE held its second competition in 2011, another "StarCraft" AI competition specifically targeting amateur AI players such as college students also quietly began.
SSCAIT:
Designed for students and amateur players
SSCAIT (the Student StarCraft AI Tournament) was founded in 2011 by Slovak researcher Michal Certicky. It is aimed primarily at undergraduates, graduate students, and anyone else interested in StarCraft AI competition.
Through BWAPI, almost anyone can develop their own StarCraft AI controller.
SSCAIT is more open than AIIDE: the competition format differs, and both practice and live matches can be watched on Twitch. Games are 1-on-1 melee; a side is eliminated when all of its buildings are destroyed, or when its program crashes and can no longer make decisions.
Even as AI keeps improving in "StarCraft" competitions and outperforms humans in many other games, to date humans can still beat AI in "StarCraft" man-machine matches.
In man-machine battle, humans still have a slight advantage
AIIDE once pitted AI against a human: the top three finishers of the 2015 "StarCraft" AI competition faced the Russian professional player Djem5, and none of them won.
In 2017, Sejong University in Seoul, South Korea, also held a man-machine competition, in which four AI players faced Song Byung Gu, a Korean "StarCraft" professional and one of the strongest players in the world.
The four AI players were MJ Bot (from Sejong University), ZZZKBot, tscmoo, and CherryPi (from Facebook's AI lab). All four were defeated within 27 minutes; the fastest game lasted just four and a half minutes.
So after creating AlphaGo, the DeepMind team at Google set its sights on "StarCraft", hoping to beat humans once more.
Google DeepMind launches official StarCraft 2 API
In August 2017, DeepMind announced a partnership with Blizzard to jointly launch the official AI API for "StarCraft II": SC2LE (the StarCraft II Learning Environment).
It is a toolset designed for AI research, applicable only to "StarCraft II". It retains many BWAPI-like capabilities while adding new ones:
- AI models can be built in Python;
- Game replays can be watched;
- Game speed can be accelerated (very useful for training AI);
- Custom maps can be built;
- Seven mini-games developed by DeepMind are included for functional testing of AI models;
- More human gameplay data can be supplied to the AI model.

Game features captured by SC2LE
SC2LE comprises two different interfaces. One resembles BWAPI, giving access to specific information in a given frame. The other is tailored to machine-learning algorithms: it packages the information obtained in the game so that learning algorithms can consume it more easily.
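The machine-learning interface follows the usual observe-act loop: each frame the environment hands the agent an observation, and the agent returns an action. The sketch below imitates that loop with a toy stand-in environment; none of the class or method names are from the real SC2LE API, whose Python side ships as the pysc2 package:

```python
class ToyEnv:
    """Toy stand-in for an SC2LE-style environment: ends after N frames."""
    def __init__(self, frames=5):
        self.remaining = frames
        self.minerals = 50

    def reset(self):
        self.minerals = 50
        return {"minerals": self.minerals, "done": False}

    def step(self, action):
        # Apply the chosen action (here just mineral income) for one frame.
        self.remaining -= 1
        self.minerals += action
        return {"minerals": self.minerals, "done": self.remaining == 0}


class ScriptedAgent:
    """Trivial scripted policy: always gather (+8 minerals per frame)."""
    def act(self, observation):
        return 8


env, agent = ToyEnv(), ScriptedAgent()
obs = env.reset()
while not obs["done"]:
    obs = env.step(agent.act(obs))

print(obs["minerals"])  # 50 + 5 * 8 = 90
```

Swapping the scripted policy for a learned one is what the ML-oriented interface is built for, and the accelerated game speed mentioned above makes running millions of such loops practical.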
Although SC2LE can allow more people to participate in the creation of AI game models, it is still difficult for AI to defeat humans in real-time strategy games such as "StarCraft". However, with the participation of internationally renowned technology companies such as Google, Microsoft, and Facebook, all this may be reversed.
In June of this year, OpenAI, the non-profit AI lab co-founded by Elon Musk and other Silicon Valley figures, defeated teams of amateur human players in the multiplayer real-time battle game "DOTA 2", and in August it beat a team of semi-professionals.
"DOTA" was developed based on the game engine of "World of Warcraft III". Like "StarCraft", "World of Warcraft" is also an RTS game developed by Blizzard. Although "DOTA 2" has been separated from the engine, some game mechanisms are still retained.
Although the mechanics of "Warcraft III" are more complex than those of "DOTA 2" and its maps are larger, AI's victories in "DOTA 2" show that AI has taken another step forward in real-time strategy games. As more large technology companies join in, AI may even create new game mechanics, turning today's competitive games into contests between humans and AI.