Parts and Wholes: Scenarios and Simulators for Human Performance Studies

As tools such as full-scale simulators and microworlds become more readily available to researchers, a fundamental question remains: to what extent are full scenarios and simulators necessary for valid and generalizable results? In this paper, we explore the continuum of scenarios and simulators and evaluate the advantages and disadvantages of each for human performance studies. The scenarios presented to participants may range from microtasks to complex multi-step scenarios. Microtasks usually involve only brief exposure to the human-system interface but thereby facilitate efficient data collection through repeated trials. In contrast, full scenarios present a sequence of actions that may require an extended period of time. The tradeoffs center on the fidelity of the situations and on the type of human performance data to be collected. The simulator presented to participants may likewise range from a part-task simulator to a simplified microworld to a full-scope, high-fidelity simulator. Simplified simulators offer greater experimental control but lose much of the real-world context found in full-scope simulators. We frame scenarios and simulators in the context of micro- vs. macro-cognition and provide examples of how the different experimental design choices lend themselves to different types of studies.
