Comparing mind perception in strategic exchanges: human-agent negotiation, dictator and ultimatum games

Recent research shows that how we respond to other social actors depends on the kind of mind we ascribe to them. We compared how the perceived minds of artificial agents shape people’s behavior in the dictator game, the ultimatum game, and negotiation. To do so, we varied agents’ minds along the two dimensions of mind perception theory, agency (cognitive aptitude) and patiency (affective aptitude), via descriptions and dialogs. In the first study, agents with described emotional capacity garnered more allocations in the dictator game, whereas in the ultimatum game both described agency and affective capacity led to greater offers. In the second study, on negotiation, agents ascribed low-agency traits earned more points than those ascribed high-agency traits, even though the negotiation tactic was identical for all agents. Although patiency did not affect game points, participants sent more happy and surprise emojis, and more emotionally valenced messages, to agents that demonstrated emotional capacity during negotiation. Exploratory analyses further indicate that, across all games, people felt related only to agents with perceived affective aptitude. Both perceived agency and perceived affective capacity contributed to moral standing after the dictator and ultimatum games, but after negotiation only agents with perceived affective capacity were granted moral standing. Thus, manipulating the mind dimensions of machines has different effects in dictator and ultimatum games than in a more complex economic exchange like negotiation. We discuss these results, which suggest that through negotiation agents are perceived not only as social actors but as intentional actors, in contrast with simple economic games.
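
The two simple games differ only in whether the recipient can veto the split, which is what makes the comparison with negotiation informative. The following minimal sketch (not code from the paper; all names are illustrative) shows the standard payoff rules of the two games:

```python
# Illustrative sketch of the standard payoff rules for the dictator and
# ultimatum games; not an implementation from the study.

def dictator_payoffs(pot: int, offer: int) -> tuple[int, int]:
    """Proposer splits the pot unilaterally; the recipient has no say."""
    assert 0 <= offer <= pot
    return pot - offer, offer

def ultimatum_payoffs(pot: int, offer: int, accepted: bool) -> tuple[int, int]:
    """Responder can veto: rejection leaves both players with nothing."""
    assert 0 <= offer <= pot
    return (pot - offer, offer) if accepted else (0, 0)

if __name__ == "__main__":
    # Example: a participant allocates 3 of 10 units to an artificial agent.
    print(dictator_payoffs(10, 3))          # (7, 3): the agent cannot reject
    print(ultimatum_payoffs(10, 3, False))  # (0, 0): rejection costs both sides
```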
