Trust in Computers and Robots: The Uses and Boundaries of the Analogy to Interpersonal Trust

Trust is a complex concept with many meanings and many associated variables; it is not a single concept, state, or continuum. Panelists will briefly argue their positions on concepts of trust in automation, and on whether (or to what extent) trust in automation should be understood by analogy to interpersonal trust. There is considerable divergence of opinion on these matters, and on the question of whether robots can engage in trustworthy relations with humans.
