Virtual reality driving simulator for user studies on automated driving

Nowadays, HCI research on automated driving is commonly carried out using either low-fidelity setups with 2D monitors or expensive driving simulators with motion platforms. Furthermore, software for user studies on automated driving is often costly and hard to adapt to different scenarios. We aim to fill this gap with a low-cost, high-fidelity, immersive prototyping simulator based on virtual reality (VR) technology: AutoWSD, an automated driving simulator for research on windshield displays. We showcase a hybrid software and hardware solution, demonstrate how to design and implement scenarios for user studies, and thereby encourage discussion about potential improvements and extensions for AutoWSD, as well as about the topics of trust, acceptance, user experience, and simulator sickness in automated driving.
