Visualization of Spatial Sensor Data in the Context of Automotive Environment Perception Systems

Spatial sensor systems in cars are becoming increasingly important. Such sensor systems are the foundation of future safety systems, such as automatic emergency braking, and of interactive driver assistance systems. We have developed a system that can visualize such spatial sensor data. Two environments are supported: a laboratory setup for offline experience and a car setup that enables live experience of spatially aligned laser scanner and video data in real traffic. We have used two visualization devices, a video see-through LCD flat panel (TFT) and an optical see-through head-mounted display (HMD), in both setups. For the laboratory setup, a back-projection table has been integrated as well. To present data in correct spatial alignment, we have installed tracking systems in both environments. We have developed visualization schemes for spatial sensor data and for geometric models that outline recognized objects. We report on our system and discuss experiences from the development and realization phases. The system is not intended to be used as a component of real driver assistance systems. Rather, it can bridge the gap between human-machine interface (HMI) designers and sensing engineers during the development phase. Furthermore, it can serve both as a debugging tool for the realization of environmental perception systems and as an experimental platform for the design of presentation schemes for upcoming driver assistance systems.
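At the core of such spatially aligned visualization is a chain of rigid-body transformations: a point measured in the laser scanner's coordinate frame is mapped, via extrinsic calibration and the tracked display pose, into the video camera's frame and then projected onto the image for overlay. The following sketch illustrates this pipeline under assumed, hypothetical calibration values (the rotation, translation, and camera intrinsics below are made up for illustration); it is not the paper's implementation, merely a minimal example of overlaying scanner data on video using a pinhole camera model.

```python
import numpy as np

def rigid_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical extrinsics: scanner frame -> camera frame. In a real setup these
# would come from calibration and from the tracking system at each frame.
R_scanner_to_cam = np.array([
    [0.0, -1.0,  0.0],
    [0.0,  0.0, -1.0],
    [1.0,  0.0,  0.0],
])
t_scanner_to_cam = np.array([0.1, -0.2, 0.5])  # meters
T_scanner_to_cam = rigid_transform(R_scanner_to_cam, t_scanner_to_cam)

# Hypothetical pinhole intrinsics of the video camera (fx, fy, cx, cy in pixels).
K = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])

def project_scan_point(p_scanner):
    """Map a 3D scanner point to a pixel (u, v); returns None if it lies behind the camera."""
    p_h = np.append(p_scanner, 1.0)          # homogeneous coordinates
    p_cam = (T_scanner_to_cam @ p_h)[:3]     # point expressed in the camera frame
    if p_cam[2] <= 0:                        # behind the image plane, not visible
        return None
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]                  # perspective division

# Example: a scan return 10 m ahead of the scanner lands near the image center.
print(project_scan_point(np.array([10.0, 0.0, 0.0])))
```

In a video see-through setup the camera pose is known directly from the mounted camera; for an optical see-through HMD, the same chain applies, but the camera model must instead be obtained by a per-user calibration of the eye-display geometry, e.g. with a procedure such as SPAAM.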
