Recommended roles for uninhabited team members within mixed-initiative combat teams

Trust in automation is a well-researched topic that is particularly important when planning mixed-initiative interaction. In teams composed of both human and non-human members, the amount of trust the operator places in the automation often determines which parts of the interaction can be automated and what level of automation is appropriate. The mixed-initiative community has built numerous systems that leverage trust in automation, but results have been inconclusive. After examining the primary factors that affect trust in automated systems, we make several recommendations for assigning roles to the human and non-human members of mixed-initiative teams.
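The abstract does not describe a concrete mechanism for tying operator trust to a level of automation, but the idea can be illustrated with a minimal sketch. In the example below, the 0-1 trust scale, the thresholds, and the level names are all assumptions made for illustration; they are not taken from this paper or from any particular level-of-automation taxonomy.

```python
# Illustrative sketch only: map a normalized operator-trust estimate to a
# coarse level of automation. All values and names here are hypothetical.

from enum import Enum


class AutomationLevel(Enum):
    MANUAL = 1                   # operator acts; automation only advises
    MANAGEMENT_BY_CONSENT = 2    # automation proposes, operator approves each action
    MANAGEMENT_BY_EXCEPTION = 3  # automation acts unless the operator vetoes
    FULL_AUTONOMY = 4            # automation acts and reports afterwards


def recommend_level(trust: float) -> AutomationLevel:
    """Pick an automation level from a trust estimate in [0, 1]."""
    if not 0.0 <= trust <= 1.0:
        raise ValueError("trust must be in [0, 1]")
    if trust < 0.25:
        return AutomationLevel.MANUAL
    if trust < 0.5:
        return AutomationLevel.MANAGEMENT_BY_CONSENT
    if trust < 0.8:
        return AutomationLevel.MANAGEMENT_BY_EXCEPTION
    return AutomationLevel.FULL_AUTONOMY


if __name__ == "__main__":
    for t in (0.1, 0.4, 0.7, 0.95):
        print(f"trust={t:.2f} -> {recommend_level(t).name}")
```

The fixed thresholds are a deliberate simplification; in practice the mapping would also account for task criticality, operator workload, and the reliability history of the automation.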
