Rage against the machine: Automation in the moral domain

Abstract The introduction of ever more capable autonomous systems is proceeding at a rapid pace. Technological progress will enable us to delegate entirely to machines processes that were once the prerogative of humans. Progress in fields like autonomous driving promises large benefits on both economic and ethical dimensions. Yet there is little research investigating the use of machines to perform tasks in the moral domain. This study explores whether subjects are willing to delegate tasks that affect third parties to machines, and how that decision is evaluated by an impartial observer. We examined two factors that might shape attitudes toward machine use: perceived utility of, and trust in, the automated device. We found that people are hesitant to delegate to a machine and that observers judge such delegations in a relatively critical light. Neither perceived utility nor trust, however, can account for this pattern. Alternative explanations that we tested in a post-experimental survey also find no support. We may thus be observing an aversion per se against machine use in the moral domain.
