It's (Not) Your Fault! Blame and Trust Repair in Human-Agent Cooperation

In cooperative settings, the success of the team is interlinked with the performance of its individual members. There must therefore be a way to address problems and mistakes made by team members. A common means in human-human interaction is the attribution of blame. Yet it is not clear how blame attributions affect cooperation between humans and intelligent virtual agents, or the overall perception of the agent. As a first step toward answering these questions, a study on cooperative human-agent interaction was conducted. The study was designed to investigate the effects of two different blaming strategies used by the agent in response to an alleged goal-achievement failure: self-blame (the agent blames itself) followed by an apology, versus other-blame (the agent blames the user). The results indicate that the combination of blame and trust repair enables a successful continuation of the cooperation without loss of trust and likeability.
