The Relationship Between Trust and Use Choice in Human-Robot Interaction

Objective: To understand the influence of trust on use choice in human-robot interaction via experimental investigation. Background: The general assumption that trusting a robot leads to using that robot has been identified previously, often by asking participants to choose between completing a task manually or using an automated aid. Our work further evaluates the relationship between trust and use choice and examines the factors that influence choice. Method: An experiment was conducted in which participants rated a robot on a trust scale and then decided whether to use that robotic agent or a human agent to complete a task. Participants provided explicit reasoning for their choices. Results: While we found statistical support for the "trust leads to use" relationship, qualitative results indicate that other factors shape the decision as well. Conclusion: Results indicated that while trust leads to use, use is also heavily influenced by the specific task at hand. Users more often chose a robot for a dangerous task where loss of life was likely, citing safety as their primary concern. Conversely, users chose humans for a mundane warehouse task, mainly citing financial reasons, specifically fear of job and income loss for the human worker. Application: Understanding the factors that drive use choice is key to fostering appropriate use in human-robot teaming.
