Teleoperated trajectory tracking of remotely operated vehicles using spatial auditory interface

Abstract: Most Human-Machine Interfaces (HMIs) designed for the teleoperation of Unmanned Vehicles (UVs) present information only visually. Frequent overloading of the operator's visual channel can lead to mishaps. To reduce such mishaps and improve overall operating performance, this paper proposes extending the HMI with the auditory modality in the form of a spatial auditory display for trajectory tracking, the most complex guidance task. The main novelty of the interface is the introduction of guidance laws that generate the reference, presented to the operator as a spatial auditory image of a virtual target to be followed. The guidance laws for teleoperated trajectory tracking are based on a modified "lookahead distance" strategy known from path-following applications. The paper also analyzes the stability of the kinematic controller based on this new guidance law. Experiments show that the novel guidance strategy provides a comprehensible and effective reference, yielding excellent trajectory tracking performance.
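The abstract does not reproduce the modified guidance law itself, but the classical lookahead-distance scheme it builds on (Fossen-style path following) can be sketched as below. The function name, the straight-segment cross-track error computation, and the placement of the virtual target are illustrative assumptions for this sketch, not the paper's method; in the proposed interface the virtual target would be the point rendered to the operator as a spatialized sound source.

```python
import math

# Minimal sketch of a classical lookahead-distance guidance law for a
# straight path segment. Names and parameters are assumptions for
# illustration; the paper's modified law for teleoperated trajectory
# tracking is not reproduced here.

def lookahead_guidance(pos, wp_prev, wp_next, delta):
    """Return (desired_course_rad, virtual_target_xy) for the segment
    from wp_prev to wp_next, given lookahead distance delta."""
    px, py = wp_prev
    qx, qy = wp_next
    x, y = pos

    # Path tangential angle chi_p of the segment.
    chi_p = math.atan2(qy - py, qx - px)

    # Along-track (s) and cross-track (e) errors in the path-fixed frame.
    dx, dy = x - px, y - py
    s = dx * math.cos(chi_p) + dy * math.sin(chi_p)
    e = -dx * math.sin(chi_p) + dy * math.cos(chi_p)

    # Lookahead steering: aim delta ahead of the projection point on the
    # path, which drives the cross-track error e toward zero.
    chi_d = chi_p + math.atan2(-e, delta)

    # Virtual target on the path, delta ahead of the projection point.
    tx = px + (s + delta) * math.cos(chi_p)
    ty = py + (s + delta) * math.sin(chi_p)
    return chi_d, (tx, ty)


if __name__ == "__main__":
    chi_d, target = lookahead_guidance(pos=(2.0, 3.0),
                                       wp_prev=(0.0, 0.0),
                                       wp_next=(10.0, 0.0),
                                       delta=5.0)
    print(math.degrees(chi_d), target)  # about -31 degrees, target (7, 0)
```

The lookahead distance delta acts as the main tuning parameter: smaller values steer more aggressively toward the path, larger values give smoother but slower convergence, which is the trade-off underlying the stability analysis of the kinematic controller mentioned in the abstract.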
