Traffic Participants in the Loop: A Mixed Reality-Based Interaction Testbed for the Verification and Validation of Autonomous Vehicles

In order to verify and validate autonomous vehicles, testbeds that integrate the whole system from perception to actuation are necessary. Above all, this applies to assessing the autonomous vehicle's performance in the presence of vulnerable road users: while experiments on the interaction between autonomous vehicles and pedestrians in critical traffic scenarios can hardly be conducted in reality, virtual experiments often lack plausibility. To address this issue, we present a mixed reality testbed for the verification and validation of autonomous vehicles confronted with realistic road user behavior in critical, worst-case traffic scenarios. We achieve this by registering an immersed pedestrian and the automated driving function within a common environment model that provides challenging traffic scenarios. The testbed is applicable at different integration levels of the automated driving function and enables a high level of behavioral realism. We evaluate the testbed qualitatively and discuss it within a concrete use case.
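The core idea of registering the immersed pedestrian in the same environment model as the automated driving function can be illustrated with a minimal sketch: a planar pose reported by the VR tracking system is transformed into the common map frame via a calibrated rigid transform. The frame names, transform values, and tracker output below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): registering a VR-tracked pedestrian
# in the same world frame as the automated driving function. The calibration
# values and pose below are assumed for illustration only.
import numpy as np


def pose_to_matrix(x, y, yaw):
    """Build a 2D homogeneous transform from a planar pose (x, y in m, yaw in rad)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])


# Assumed calibration: transform from the VR tracking frame to the common map frame.
T_map_vr = pose_to_matrix(x=12.0, y=-3.5, yaw=np.deg2rad(90.0))

# Pedestrian pose as reported by the head tracker, expressed in the VR frame.
T_vr_ped = pose_to_matrix(x=1.2, y=0.4, yaw=np.deg2rad(15.0))

# Register the pedestrian in the common environment model used by the driving function.
T_map_ped = T_map_vr @ T_vr_ped
x_map, y_map = T_map_ped[0, 2], T_map_ped[1, 2]
yaw_map = np.arctan2(T_map_ped[1, 0], T_map_ped[0, 0])

print(f"Pedestrian in map frame: x={x_map:.2f} m, y={y_map:.2f} m, yaw={np.degrees(yaw_map):.1f} deg")
```

With both the pedestrian and the vehicle expressed in this common frame, the driving function can be stimulated with the pedestrian's behavior regardless of whether its perception, planning, or actuation stage is under test.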
