The impact of surface projection on military tactics comprehension

ABSTRACT This experiment assessed how projecting information onto different surfaces (flat vs. raised terrain) influenced the performance, workload, and engagement of cadets answering questions on military tactics. In a within-subjects design, sixty-two cadets each answered 24 tactics-related questions across two conditions (12 on the flat surface, 12 on the raised surface); performance was measured by accuracy and time on task. After each set of 12 questions, the cadets completed post-task surveys assessing engagement, measured by a modified User Engagement Scale and the System Usability Scale, and workload, measured by the NASA-TLX. Findings indicated that the raised terrain surface led to reduced workload and increased engagement and time on task compared with the flat terrain surface. A practice effect drove the performance metrics (time on task and accuracy): learners performed better on whichever surface type was displayed second. This research expands the literature base supporting alternative display methods to increase engagement and augment instruction in military tactics tasks.
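As a minimal sketch (not the authors' analysis code), the snippet below illustrates how the paired flat-vs.-raised comparisons and the reported practice effect could be tested for a within-subjects design like this one. The file name and column names (cadet_results.csv, cadet_id, surface, order, tlx, ues, accuracy, time_on_task) are assumed placeholders, not taken from the study materials.

```python
# Minimal analysis sketch for a within-subjects flat vs. raised comparison,
# assuming a hypothetical long-format file with one row per cadet x surface
# condition and placeholder columns: cadet_id, surface ("flat"/"raised"),
# order (1 = surface seen first, 2 = seen second), tlx, ues, accuracy,
# time_on_task.
import pandas as pd
from scipy import stats

df = pd.read_csv("cadet_results.csv")

# Align each cadet's flat and raised rows so the samples are paired.
flat = df[df["surface"] == "flat"].sort_values("cadet_id")
raised = df[df["surface"] == "raised"].sort_values("cadet_id")

# Paired t-test per dependent measure (workload, engagement, performance).
for measure in ["tlx", "ues", "accuracy", "time_on_task"]:
    res = stats.ttest_rel(flat[measure].to_numpy(), raised[measure].to_numpy())
    print(f"{measure}: t = {res.statistic:.2f}, p = {res.pvalue:.3f}")

# Probe the practice effect: compare performance on whichever surface each
# cadet saw first against whichever they saw second, regardless of type.
first = df[df["order"] == 1].sort_values("cadet_id")
second = df[df["order"] == 2].sort_values("cadet_id")
res = stats.ttest_rel(first["accuracy"].to_numpy(), second["accuracy"].to_numpy())
print(f"practice effect (accuracy): t = {res.statistic:.2f}, p = {res.pvalue:.3f}")
```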
