Low-Level Automation as a Pathway to Appropriate Trust in the PED Enterprise: Design of a Collaborative Work Environment

In military intelligence, Processing, Exploitation, and Dissemination (PED) functions are critical to mission success, providing capabilities that support the intelligence request lifecycle. PED capabilities are increasingly available to smaller, more centralized teams that support multiple battlespace operators. Lessons learned reveal that while existing PED processes provide detailed intelligence, operators are burdened with continual coordination, communication, and interpretation tasks that, coupled with the data volume generated by multi-INT ISR platforms, cause breakdowns in coordination and communication. These breakdowns result in failures to share, attend to, and interpret information critical to mission success. Key contributors to these breakdowns are the lack of automated support and the cognitive incongruence between existing automated solutions and the support analysts actually require, both of which reduce or destroy trust. Developers must recognize that system interactions establishing a human-machine dialogue are necessary to develop the trust required for automation's successful adoption and employment. To overcome these challenges, we present a Collaborative Work Environment for PED operations. It includes decision-centered analytics automation and low-level task automation designed to establish the human-machine dialogue that engenders trust. This approach fostered appropriate attitudes of trust, leading analysts to request that we develop and deploy higher-level, planned automation capabilities to further offload PED tasks.