Judging One’s Own or Another Person’s Responsibility in Interactions With Automation

OBJECTIVE: We explore users' and observers' subjective assessments of human and automation capabilities and of human causal responsibility for outcomes.
BACKGROUND: With intelligent systems and advanced automation, human responsibility for outcomes becomes equivocal, as do subjective perceptions of responsibility. In particular, actors who actively work with a system may perceive responsibility differently from observers.
METHOD: In a laboratory experiment with pairs of participants, one participant (the "actor") performed a decision task aided by an automated system, and the other (the "observer") passively watched the actor. We compared perceptions of responsibility between the two roles for two systems with different capabilities.
RESULTS: Actors' behavior matched theoretical predictions, and actors and observers gave similar assessments of system and human capabilities and of the comparative human responsibility. However, actors tended to attribute adverse outcomes more to system characteristics than to their own limitations, whereas observers gave insufficient weight to system capabilities when evaluating the actors' comparative responsibility.
CONCLUSION: When intelligent systems greatly exceed human capabilities, users may correctly feel they contribute little to system performance. They may also intervene more than necessary, impairing overall performance. Outside observers, such as managers, may overestimate users' contribution to outcomes and hold users responsible for adverse outcomes even when the users rightly trusted the system.
APPLICATION: Presenting users of intelligent systems and outside observers with performance measures and with the comparative human responsibility may help them calibrate their subjective assessments of performance, reducing users' and observers' biases and attribution errors.
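
The conclusion that users of a far more capable system contribute little, and can even hurt performance by intervening, can be made concrete with a small simulation. The sketch below is not the study's task or model: it assumes a binary signal-detection setting with Gaussian evidence and arbitrary sensitivities (d' = 1 for the simulated human, d' = 3 for the aid, values chosen purely for illustration), and a naive decision rule in which the operator responds "signal" whenever either the aid alerts or their own evidence crosses their criterion.

```python
import numpy as np

# Illustrative sketch (not the study's model): a binary signal-detection task
# in which a simulated operator sees a binary cue from an automated aid as
# well as their own noisy observation. Sensitivities are assumptions chosen
# only to illustrate the case where the aid greatly exceeds the human.

rng = np.random.default_rng(0)
N = 200_000                      # simulated trials
p_signal = 0.5                   # prior probability of a signal
d_human, d_aid = 1.0, 3.0        # assumed sensitivities (aid >> human)

signal = rng.random(N) < p_signal
# Internal evidence per source: noise trials ~ N(0, 1), signal trials ~ N(d', 1)
x_human = rng.normal(0.0, 1.0, N) + d_human * signal
x_aid = rng.normal(0.0, 1.0, N) + d_aid * signal

# Unbiased criteria at d'/2 for each source
aid_alert = x_aid > d_aid / 2                 # aid's binary alert
human_alone = x_human > d_human / 2           # human ignoring the aid
# Naive intervention rule: respond "signal" if the aid alerts OR own evidence does
combined = aid_alert | human_alone

for name, response in [("human alone", human_alone),
                       ("aid alone", aid_alert),
                       ("human + aid (OR rule)", combined)]:
    accuracy = np.mean(response == signal)
    print(f"{name:22s} accuracy = {accuracy:.3f}")
```

Under these assumptions the aid alone outperforms both the unaided human and the naive OR-combination: the operator's extra "signal" responses add false alarms faster than hits, which is one way to see how unnecessary interventions by users of a highly capable system can impair overall performance.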
