Sharing Control Between Humans and Automation Using Haptic Interface: Primary and Secondary Task Performance Benefits

This paper describes a paradigm for human/automation control sharing in which the automation acts through a motor coupled to a machine's manual control interface. The manual interface thereby becomes a haptic display that continually informs the human about automation actions. While monitoring by feel, users may choose either to conform to the automation or to override it and express their own control intentions. The paper's objective is to demonstrate that automation added through a haptic display can be used not only to improve performance on a primary task but also to reduce perceptual demands or free attention for a secondary task. Results are presented from three experiments in which 11 participants completed a lane-following task using a motorized steering wheel on a fixed-base driving simulator. The automation behaved like a copilot, assisting with lane following by applying torques to the steering wheel. Results indicate that haptic assist improves lane following by at least 30%, p < .0001, while reducing visual demand by 29%, p < .0001, or improving reaction time in a secondary tone-localization task by 18 ms, p = .0009. Potential applications of this research include the design of haptics-based automation interfaces that support human/automation control sharing better than traditional push-button automation interfaces.
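To make the shared-control idea concrete, the sketch below shows one plausible way an automation "copilot" could render assist torque on a motorized steering wheel as a function of lane error, with the driver's torque summed on the same axis so the driver feels the assist and may override it. All gains, dynamics, and names here are illustrative assumptions, not the paper's apparatus or controller.

# Minimal sketch of a haptic shared-control loop (assumed PD assist law,
# illustrative parameters; not the controller used in the paper).

# Illustrative parameters (assumptions)
K_P = 4.0              # assist torque per meter of lane error [N·m/m]
K_D = 1.0              # assist torque per m/s of lane-error rate [N·m·s/m]
WHEEL_INERTIA = 0.05   # steering wheel inertia [kg·m^2]
WHEEL_DAMPING = 0.3    # steering wheel damping [N·m·s/rad]
DT = 0.001             # control loop period [s]


def assist_torque(lane_error: float, lane_error_rate: float) -> float:
    """Automation 'copilot' torque rendered on the motorized wheel."""
    return -(K_P * lane_error + K_D * lane_error_rate)


def wheel_step(theta, omega, driver_torque, lane_error, lane_error_rate):
    """One Euler step of the shared wheel: driver and automation torques act
    on the same axis, so the driver feels the assist and may override it."""
    tau_auto = assist_torque(lane_error, lane_error_rate)
    tau_net = driver_torque + tau_auto - WHEEL_DAMPING * omega
    alpha = tau_net / WHEEL_INERTIA
    omega += alpha * DT
    theta += omega * DT
    return theta, omega, tau_auto


if __name__ == "__main__":
    # Toy demonstration: the driver applies no torque, the car sits 0.5 m off
    # the lane center, and the assist torque alone turns the wheel back.
    theta, omega = 0.0, 0.0
    lane_error, lane_error_rate = 0.5, 0.0
    for step in range(5):
        theta, omega, tau = wheel_step(theta, omega, 0.0,
                                       lane_error, lane_error_rate)
        print(f"t={step * DT:.3f} s  assist torque={tau:+.2f} N·m  "
              f"wheel angle={theta:+.5f} rad")

In this arrangement the wheel doubles as a haptic display: the driver senses the automation's intent through the assist torque and can either yield to it or apply an opposing torque to override it.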
