Exploring Gameplay With AI Agents

The process of playtesting a game is subjective, expensive, and incomplete. In this paper, we present a playtesting approach that explores the game space with automated agents and collects data to answer questions posed by the designers. Rather than having agents interact with an actual game client, this approach recreates the bare-bones mechanics of the game as a separate system. Our agent can play in minutes what would take testers days of organic gameplay. The analysis of thousands of game simulations exposed imbalances in game actions, identified inconsequential rewards, and evaluated the effectiveness of optional strategic choices. Our test-case game, The Sims Mobile, was recently released, and the findings shown here influenced design changes that resulted in an improved player experience.
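To make the idea concrete, the sketch below shows a minimal version of this style of playtesting: the game's mechanics are reduced to a table of actions with time costs and rewards, an automated agent plays thousands of simulated games, and per-action statistics are aggregated for designers. The action names, costs, rewards, and policies are illustrative assumptions, not the paper's actual model or implementation.

```python
# Illustrative sketch (not the paper's implementation): a stripped-down,
# mechanics-only simulation in which an automated agent plays many games
# and logs per-action statistics for designers to inspect.
import random
from collections import defaultdict

# Hypothetical simplified action model: name -> (time cost in minutes, reward points)
ACTIONS = {
    "quick_chat":   (1.0,  5.0),
    "long_event":   (8.0, 30.0),
    "side_hobby":   (3.0,  4.0),   # candidate "inconsequential reward"
    "career_shift": (5.0, 22.0),
}

def play_one_game(policy, goal=200.0):
    """Simulate a single playthrough until the reward goal is reached."""
    elapsed, score = 0.0, 0.0
    usage = defaultdict(int)
    while score < goal:
        name = policy(ACTIONS)
        cost, reward = ACTIONS[name]
        elapsed += cost
        score += reward
        usage[name] += 1
    return elapsed, usage

def greedy_policy(actions):
    """Pick the action with the best reward-per-minute ratio (optimizing agent)."""
    return max(actions, key=lambda a: actions[a][1] / actions[a][0])

def random_policy(actions):
    """Uniform random choice, a crude stand-in for unoptimized organic play."""
    return random.choice(list(actions))

def run_experiment(policy, runs=10_000):
    """Play many simulated games and report aggregate statistics."""
    totals, usage = [], defaultdict(int)
    for _ in range(runs):
        elapsed, used = play_one_game(policy)
        totals.append(elapsed)
        for name, count in used.items():
            usage[name] += count
    print(f"mean time to goal: {sum(totals) / len(totals):.1f} min")
    for name, count in sorted(usage.items(), key=lambda kv: -kv[1]):
        print(f"  {name}: {count / runs:.1f} uses per game")

if __name__ == "__main__":
    run_experiment(greedy_policy)
    run_experiment(random_policy)
```

Comparing the two policies' action-usage profiles is one way such a system can surface imbalanced actions (one choice dominating under optimal play) or rewards that never meaningfully affect progression.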
