Low-level Automation as a Pathway to Appropriate Trust in an Intelligent PED Enterprise: Design of a Collaborative Work Environment

In military intelligence, Processing, Exploitation, and Dissemination (PED) functions are critical to mission success, providing the capabilities that support the intelligence-request lifecycle. PED capabilities are increasingly concentrated in smaller, more centralized teams that support multiple battlespace operators. Lessons learned reveal that while existing PED processes yield detailed intelligence, operators are burdened with continual coordination, communication, and interpretation tasks that, coupled with the data volume generated by multi-INT ISR platforms, cause breakdowns in coordination and communication. These breakdowns result in failures to share, attend to, and interpret information critical to mission success. Key contributors to these breakdowns are the lack of automated support and the cognitive incongruence between existing automated solutions and the support analysts actually require, both of which reduce or destroy trust. Developers must recognize that system interactions establishing a human-machine dialogue are necessary to build the trust required for automation's successful adoption and employment. To overcome these challenges, we present a Collaborative Work Environment for PED operations that includes decision-centered analytics automation and low-level task automation designed to establish the human-machine dialogue that engenders trust. This approach fostered appropriate attitudes of trust, leading analysts to request that we develop and deploy higher-level, planned automation capabilities to further offload PED tasks.