Hierarchical Adversarial Search Applied to Real-Time Strategy Games

Real-Time Strategy (RTS) video games have proven to be a very challenging application area for artificial intelligence research. Existing AI solutions are limited by vast state and action spaces and real-time constraints. Most implementations efficiently tackle specific tactical or strategic sub-problems, but no single algorithm is fast enough to be applied successfully to large problem instances such as a complete game of StarCraft. This paper presents a hierarchical adversarial search framework that more closely models the human way of thinking, much like the chain of command employed by the military. Each level of the hierarchy implements a different abstraction, from deciding how to win the game at the top to issuing individual unit orders at the bottom. We apply a 3-layer version of our model to SparCraft, a StarCraft combat simulator, and show that it outperforms state-of-the-art algorithms such as Alpha-Beta, UCT, and Portfolio Search in large combat scenarios featuring multiple bases and up to 72 mobile units per player, under real-time constraints of 40 ms per search episode.
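The sketch below illustrates the general shape of such a 3-layer hierarchy: a strategic layer that selects a high-level plan, a tactical layer that refines the plan into per-group objectives, and a low-level layer that turns objectives into individual unit orders. It is a minimal illustration only, assuming simplified, hypothetical class and method names (GameState passed as a plain dict, Plan, Objective, UnitOrder, choose_plan, refine, issue_orders); it is not the paper's actual implementation, and the real framework performs adversarial search with a time budget at each level rather than the toy one-step evaluation shown here.

```python
# Minimal sketch of a 3-layer hierarchical decision cycle, assuming an
# illustrative game-state API. All names here are hypothetical, not taken
# from the paper or from SparCraft.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Plan:
    """A high-level strategic choice, e.g. 'attack' or 'defend'."""
    name: str


@dataclass
class Objective:
    """A tactical objective assigned to one group of units."""
    group_id: int
    target: str


@dataclass
class UnitOrder:
    """A concrete order for a single unit."""
    unit_id: int
    command: str


class StrategicLayer:
    """Top of the hierarchy: picks the plan that maximizes an abstract
    evaluation of the resulting state (a 1-ply lookahead for brevity;
    the real search would be deeper and adversarial)."""

    def __init__(self, evaluate: Callable[[dict, Plan], float]):
        self.evaluate = evaluate

    def choose_plan(self, state: dict, candidates: List[Plan]) -> Plan:
        return max(candidates, key=lambda p: self.evaluate(state, p))


class TacticalLayer:
    """Middle layer: decomposes a plan into per-group objectives."""

    def refine(self, state: dict, plan: Plan) -> List[Objective]:
        # Placeholder decomposition: point every group at the plan's goal.
        return [Objective(group_id=g, target=plan.name) for g in state["groups"]]


class UnitLayer:
    """Bottom layer: turns each objective into individual unit orders,
    where a low-level combat search (e.g. Alpha-Beta or Portfolio Search)
    would normally run under a tight time budget."""

    def issue_orders(self, state: dict, objectives: List[Objective]) -> List[UnitOrder]:
        orders = []
        for obj in objectives:
            for unit_id in state["groups"][obj.group_id]:
                orders.append(UnitOrder(unit_id, f"move_to:{obj.target}"))
        return orders


def decision_cycle(state: dict, plans: List[Plan]) -> List[UnitOrder]:
    """One pass through the chain of command: strategy -> tactics -> orders."""
    strategic = StrategicLayer(evaluate=lambda s, p: len(p.name))  # toy evaluation
    tactical = TacticalLayer()
    low_level = UnitLayer()

    plan = strategic.choose_plan(state, plans)
    objectives = tactical.refine(state, plan)
    return low_level.issue_orders(state, objectives)


if __name__ == "__main__":
    toy_state = {"groups": {0: [1, 2], 1: [3]}}
    print(decision_cycle(toy_state, [Plan("attack"), Plan("defend")]))
```

The point of the layering is that each search operates over a much smaller abstract action space than the raw per-unit move space, which is what makes it feasible under a 40 ms budget.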
