Preparing for the aftermath: Using emotional agents in game-based training for disaster response

Ground Truth, a training game developed by Sandia National Laboratories in partnership with the University of Southern California GamePipe Lab, puts the player in the role of an incident commander working with teammate agents to respond to urban threats. These agents simulate some of the emotions a responder may feel in this high-stress situation. We construct psychologically plausible models compliant with the Sandia Human Embodiment and Representation Cognitive Architecture (SHERCA), which are run on the Sandia Cognitive Runtime Engine with Active Memory (SCREAM) software. SCREAM's computational representations for modeling human decision-making combine aspects of artificial neural networks (ANNs) and fuzzy logic networks. This paper gives an overview of Ground Truth and discusses the adaptation of SHERCA and SCREAM to the game. We include a semiformal description of SCREAM.
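The hybrid ANN/fuzzy-logic representation is easiest to picture with a small sketch. The Python toy below is not the SCREAM API; the node name, membership functions, inputs, and weights are all invented for illustration. It shows one plausible reading of the abstract: crisp game-state inputs are fuzzified through membership functions and then aggregated with a neuron-like weighted activation.

```python
# Illustrative sketch only: a toy "emotion" node that blends fuzzy-logic
# membership functions with ANN-style weighted aggregation, in the spirit
# of the hybrid representation described for SCREAM. All names, shapes,
# and weights here are hypothetical.
import math


def ramp_down(x: float, lo: float, hi: float) -> float:
    """Fuzzy 'low' set: full membership below lo, none above hi."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)


def ramp_up(x: float, lo: float, hi: float) -> float:
    """Fuzzy 'high' set: no membership below lo, full above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)


def sigmoid(z: float) -> float:
    """ANN-style squashing of the aggregated activation."""
    return 1.0 / (1.0 + math.exp(-z))


def fear_activation(threat_distance: float, casualties_seen: float) -> float:
    """Toy 'fear' node: fuzzify crisp inputs, then combine like a neuron."""
    near_threat = ramp_down(threat_distance, 10.0, 100.0)   # meters
    many_casualties = ramp_up(casualties_seen, 1.0, 10.0)   # count
    # Weighted sum of fuzzy memberships, squashed to a 0..1 activation.
    return sigmoid(3.0 * near_threat + 2.0 * many_casualties - 2.5)


if __name__ == "__main__":
    # Example: a teammate agent 25 m from the threat with 4 visible casualties.
    print(round(fear_activation(threat_distance=25.0, casualties_seen=4.0), 3))
```

In a game loop, an activation like this could feed back into the agent's action selection (e.g., hesitating or withdrawing as the value rises); that coupling is only a design guess, not the published architecture.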
