Effects of personality traits on user trust in human–machine collaborations

Data analytics-driven solutions are widely used in intelligent systems, where humans and machines make decisions collaboratively based on predictions. Human factors such as personality and trust have significant effects on such human–machine collaborations. This paper investigates the effects of personality traits on user trust in human–machine collaborations under uncertainty and cognitive load conditions. A user study of 42 subjects in a repeated factorial design experiment found that uncertainty presentation increased trust, but only under low cognitive load, when users had sufficient cognitive resources to process the information; presenting uncertainty under high cognitive load decreased trust. When results were broken down by personality trait group, users with low Openness showed the highest overall trust. Furthermore, under low cognitive load, trust was enhanced by ambiguity-uncertainty presentation for users with low Agreeableness, low Neuroticism, high Extraversion, high Conscientiousness, or high Openness. Under high cognitive load, high Neuroticism and low Extraversion were associated with higher trust when uncertainty was not presented. These results demonstrate that different personality traits affect trust differently under uncertainty and cognitive load conditions. A user trust feedback loop framework was proposed to incorporate the study results into human–machine collaborations for meaningful participatory design.
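The trait-group analysis described above splits participants into low/high groups on each Big Five trait and compares mean trust across the load × uncertainty conditions. A minimal, purely illustrative sketch of that kind of grouping is shown below; the trait scores, scale ranges, and trust ratings are invented assumptions, not the study's data or the authors' code.

```python
# Hypothetical sketch (not the study's actual analysis pipeline): median split
# on one Big Five trait, then mean trust per (cognitive load, uncertainty) cell.
from statistics import median, mean

# Each record: (openness_score, cognitive_load, uncertainty_shown, trust_rating)
# — all values are illustrative placeholders.
records = [
    (2.5, "low", True, 6.1), (6.0, "low", True, 5.2),
    (3.0, "high", False, 5.8), (6.5, "high", False, 4.9),
    (4.0, "low", False, 5.0), (5.5, "high", True, 4.2),
]

cut = median(r[0] for r in records)  # median split into low/high trait groups

def mean_trust(load, shown, high_trait):
    """Mean trust rating for one trait group in one experimental cell."""
    vals = [t for (o, l, s, t) in records
            if l == load and s == shown and (o > cut) == high_trait]
    return round(mean(vals), 2) if vals else None

# e.g. low-Openness users, low cognitive load, uncertainty presented
print(mean_trust("low", True, high_trait=False))
```

Comparing such cell means across trait groups is the shape of comparison the abstract reports (e.g. low-Openness users showing the highest overall trust); the real study would use the TIPI-style trait scores and repeated-measures statistics rather than raw cell means.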
