Assessment of UAV Operator Workload in a Reconfigurable Multi-Touch Ground Control Station Environment

Multi-touch computer inputs allow users to interact with a virtual environment through gesture commands on a monitor rather than a mouse and keyboard. This style of input is easy for the human mind to adapt to because gestures directly reflect how one interacts with the natural environment. This paper presents and assesses a personal-computer-based unmanned aerial vehicle (UAV) ground control station that uses multi-touch gesture inputs and system reconfigurability to enhance operator performance. The system was developed at Ryerson University's Mixed-Reality Immersive Motion Simulation Laboratory using commercial off-the-shelf Presagis software. The ground control station was then evaluated using NASA's Task Load Index (NASA-TLX) to determine whether the inclusion of multi-touch gestures and reconfigurability improved operator workload relative to traditional mouse-and-keyboard inputs. To conduct this assessment, participants were tasked with flying a simulated aircraft through a spe...
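The NASA-TLX assessment mentioned above combines six subjective subscale ratings (mental demand, physical demand, temporal demand, performance, effort, and frustration) into a single workload score; in the weighted variant, each subscale's weight comes from 15 pairwise comparisons and the weights sum to 15. The following sketch illustrates that standard weighted-score computation in general terms (the variable names and example ratings are illustrative, not taken from this study):

```python
# Illustrative computation of the weighted NASA-TLX workload score.
# The six subscales and the 15-comparison weighting scheme follow the
# standard TLX procedure; the example values below are made up.
SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def nasa_tlx(ratings, weights):
    """Weighted NASA-TLX overall workload (0-100).

    ratings: dict of subscale -> rating on a 0-100 scale
    weights: dict of subscale -> pairwise-comparison tally (0-5), summing to 15
    """
    assert set(ratings) == set(SUBSCALES) and set(weights) == set(SUBSCALES)
    assert sum(weights.values()) == 15  # 15 pairwise comparisons in total
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Hypothetical single-participant response:
ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 60, "frustration": 35}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(round(nasa_tlx(ratings, weights), 2))  # overall workload score
```

Comparing the mean of such scores between the multi-touch and mouse-and-keyboard conditions is the kind of analysis the paper's evaluation implies.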
