Automation-Induced Complacency Potential: Development and Validation of a New Scale
W. J. Bryant | Stephanie M. Merritt | Alicia Ako-Brew | Amy Staley | Michael McKenna | Austin Leone | Lei Shirase
[1] L. Tucker, et al. A reliability coefficient for maximum likelihood factor analysis, 1973.
[2] Thomas B. Sheridan, et al. Human and Computer Control of Undersea Teleoperators, 1978.
[3] J. H. Steiger. Structural Model Evaluation and Modification: An Interval Estimation Approach, 1990, Multivariate Behavioral Research.
[4] Raja Parasuraman, et al. Automation-Induced "Complacency": Development of the Complacency-Potential Rating Scale, 1993.
[5] Supervisory Control of a Rapid Thermal Multiprocessor, 1993.
[6] Raja Parasuraman, et al. Performance Consequences of Automation-Induced 'Complacency', 1993.
[7] D. Watson, et al. Constructing validity: Basic issues in objective scale development, 1995.
[8] Raja Parasuraman, et al. Humans and Automation: Use, Misuse, Disuse, Abuse, 1997, Hum. Factors.
[9] Raja Parasuraman, et al. Automation-induced monitoring inefficiency: role of display location, 1997, Int. J. Hum. Comput. Stud.
[10] S. M. Casey, et al. Set Phasers on Stun: And Other True Tales of Design, Technology, and Human Error, 1998.
[11] Marika Hoedemaeker, et al. Driver behavior in an emergency situation in the Automated Highway System, 1999.
[12] Jennifer Wilson, et al. Flight Deck Automation Issues, 1999.
[13] P. Bentler, et al. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives, 1999.
[14] Christopher D. Wickens, et al. A model for types and levels of human interaction with automation, 2000, IEEE Trans. Syst. Man Cybern. Part A.
[15] R. Vandenberg, et al. A Review and Synthesis of the Measurement Invariance Literature: Suggestions, Practices, and Recommendations for Organizational Research, 2000.
[16] Toshiyuki Inagaki, et al. Attention and complacency, 2000.
[17] Lawrence J. Prinzel, et al. Examination of Automation-Induced Complacency and Individual Difference Variates, 2001.
[18] Linda G. Pierce, et al. The Perceived Utility of Human and Automated Aids in a Visual Detection Task, 2002, Hum. Factors.
[19] Neville Moray. Monitoring, complacency, scepticism and eutactic behaviour, 2003.
[20] G. Jamieson, et al. Considering Subjective Trust and Monitoring Behavior in Assessing Automation-Induced "Complacency", 2004.
[21] Greg A. Jamieson, et al. The impact of context-related reliability on automation failure detection and scanning behaviour, 2004, 2004 IEEE International Conference on Systems, Man and Cybernetics (IEEE Cat. No.04CH37583).
[22] Frank J. Lee, et al. Toward an ACT-R General Executive for Human Multitasking, 2004, ICCM.
[23] Raja Parasuraman, et al. Automation in Future Air Traffic Management: Effects of Decision Aid Reliability on Controller Performance and Mental Workload, 2005, Hum. Factors.
[24] Jason W. Osborne, et al. Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis, 2005.
[25] Mark W. Scerbo, et al. Automation-induced complacency for monitoring highly reliable systems: the role of task complexity, system experience, and operator trust, 2007.
[26] John D. Lee, et al. Review of a Pivotal Human Factors Article: "Humans and Automation: Use, Misuse, Disuse, Abuse", 2008, Hum. Factors.
[27] Dietrich Manzey, et al. Misuse of automated decision aids: Complacency, automation bias and the impact of training experience, 2008, Int. J. Hum. Comput. Stud.
[28] Daniel R. Ilgen, et al. Not All Trust Is Created Equal: Dispositional and History-Based Trust in Human-Automation Interactions, 2008, Hum. Factors.
[29] Mark S. Young, et al. Cooperation between drivers and automation: implications for safety, 2009.
[30] 劉 健勤, et al. The Default Mode Network from a Complex-Systems Perspective, 2009, ICS 2009.
[31] Elizabeth M. Poposki, et al. The Multitasking Preference Inventory: Toward an Improved Measure of Individual Differences in Polychronicity, 2010.
[32] J. Klomp, et al. A review and synthesis, 2010.
[33] R. Dalal, et al. A Review and Synthesis of Situational Strength in the Organizational Sciences, 2010.
[34] Raja Parasuraman, et al. Complacency and Bias in Human Use of Automation: An Attentional Integration, 2010, Hum. Factors.
[35] Stephanie M. Merritt. Affective Processes in Human–Automation Interactions, 2011, Hum. Factors.
[36] Torsten Rohlfing, et al. Cerebral blood flow in posterior cortical nodes of the default mode network decreases with task engagement but remains higher than in most brain regions, 2011, Cerebral Cortex.
[37] Dietrich Grasshoff, et al. The effect of complacency potential on human operators' monitoring behaviour in aviation, 2011.
[38] Juan R. Vidal, et al. Transient Suppression of Broadband Gamma Power in the Default-Mode Network Is Correlated with Task Complexity and Subject Performance, 2011, The Journal of Neuroscience.
[39] Timothy E. Ham, et al. Default Mode Network Connectivity Predicts Sustained Attention Deficits after Traumatic Brain Injury, 2011, The Journal of Neuroscience.
[40] Huabin Tang, et al. Phase-2 evaluation of a tactical conflict detection tool in the terminal area, 2012, 2012 IEEE/AIAA 31st Digital Avionics Systems Conference (DASC).
[41] Yves Rosseel, et al. lavaan: An R Package for Structural Equation Modeling, 2012.
[42] Elizabeth M. Poposki, et al. Detecting and Deterring Insufficient Effort Responding to Surveys, 2012.
[43] Deborah Lee, et al. I Trust It, but I Don't Know Why, 2013, Hum. Factors.
[44] Savita Verma, et al. Human Factors Evaluation of Conflict Detection Tool for Terminal Area, 2013.
[45] Sonja Giesemann, et al. Automation Effects in Train Driving with Train Protection Systems: Assessing Person- and Task-related Factors, 2013.
[46] R Core Team. R: A language and environment for statistical computing, 2014.
[47] R. Pak, et al. The Effects of Age and Working Memory Demands on Automation-Induced Complacency, 2014.
[48] Kim-Phuong L. Vu, et al. Air Traffic Controller Trust in Automation in NextGen, 2015.
[49] Francis T. Durso, et al. Individual Differences in the Calibration of Trust in Automation, 2015, Hum. Factors.
[50] D. Besner, et al. A Resource-Control Account of Sustained Attention, 2015, Perspectives on Psychological Science.
[51] Deborah Lee, et al. Measuring Individual Differences in the Perfect Automation Schema, 2015, Hum. Factors.
[52] J. Schooler, et al. Vigilance impossible: Diligence, distraction, and daydreaming all lead to failures in a practical monitoring task, 2015, Consciousness and Cognition.
[53] James P. Bliss, et al. Does Accountability and an Automation Decision Aid's Reliability Affect Human Performance in a Visual Search Task?, 2017.
[54] Ruth A. Baer, et al. Mindfulness, 2017, SAGE Research Methods Foundations.
[55] Francesca C. Fortenbaugh, et al. Recent theoretical, neural, and clinical advances in sustained attention research, 2017, Annals of the New York Academy of Sciences.