Multimodal Warnings in Remote Operation: A Case Study on Remote Driving

Developments in sensor technology, artificial intelligence, and network technologies such as 5G have made remote operation a valuable method of controlling various types of machinery. Among its benefits, remote operation allows work in hazardous environments without placing operators at risk. Its major limitation is the lack of proper sensory feedback from the machine, which degrades situational awareness and, consequently, may compromise the safety of remote operations. This article explores how to improve situational awareness through multimodal feedback (visual, auditory, and haptic) and studies how such feedback can be used to communicate warnings to remote operators. To this end, we conducted a controlled within-subjects experiment with twenty-four participants across eight conditions on a simulated remote driving system. We gathered further insights with a UX questionnaire and semi-structured interviews. The data showed that multimodal feedback positively affected situational awareness when driving remotely. Our findings indicate that the combination of added haptic and visual feedback was considered the best way to communicate the slipperiness of the road. We also found that the feeling of presence is an important aspect of remote driving, and one requested especially by participants with more experience operating real heavy machinery.
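With three feedback modalities and eight conditions, the design is consistent with a full 2×2×2 factorial layout (each modality toggled on or off). The abstract does not state the exact design, so the sketch below is an assumption, not the authors' method; it only illustrates how such a condition set could be enumerated.

```python
from itertools import product

# Assumed modalities from the abstract: visual, auditory, haptic.
MODALITIES = ("visual", "auditory", "haptic")

def factorial_conditions(modalities=MODALITIES):
    """Enumerate every on/off combination of the given modalities.

    A full factorial design over n binary factors yields 2**n
    conditions; for three modalities that is 2**3 = 8, matching
    the eight conditions reported in the abstract (assumption).
    """
    conditions = []
    for flags in product((False, True), repeat=len(modalities)):
        active = tuple(m for m, on in zip(modalities, flags) if on)
        conditions.append(active)
    return conditions

if __name__ == "__main__":
    for cond in factorial_conditions():
        print(cond or ("baseline (no added feedback)",))
```

The empty tuple corresponds to a baseline condition with no added feedback, and the full tuple to all three modalities combined.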
