Multimodal focus attention and stress detection and feedback in an augmented driver simulator

This paper presents a driver simulator that takes into account information about the user's state of mind (attention level, fatigue, stress). The analysis of the user's state of mind is based on video data and biological signals. Facial movements such as eye blinks, yawns, and head rotations are detected in the video data and used to evaluate the driver's fatigue and attention levels. The driver's electrocardiogram and galvanic skin response are recorded and analyzed to evaluate the stress level. The driver simulator software is modified so that the system can react appropriately to critical situations of fatigue and stress: audio and visual messages are sent to the driver, wheel vibrations are generated, and the driver is expected to react to the alert messages. A multi-threaded system is proposed to support multiple messages sent through different modalities. Strategies for data fusion and fission are also provided.
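The multi-threaded, multi-modality alert mechanism described above could be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the modality names, the `dispatch_alerts` function, and the queue-per-modality structure are assumptions made here to show the general idea of fanning one alert out (fission) to several output channels, each served by its own worker thread.

```python
import queue
import threading

# Hypothetical modalities for the alert feedback channels.
MODALITIES = ("audio", "visual", "vibration")

def dispatch_alerts(alerts):
    """Fan each alert out to every modality, one worker thread per channel."""
    channels = {m: queue.Queue() for m in MODALITIES}
    delivered = []               # records (modality, alert) pairs as "sent"
    lock = threading.Lock()

    def worker(modality, q):
        while True:
            alert = q.get()
            if alert is None:    # sentinel value: shut this worker down
                break
            with lock:           # stand-in for actually driving the device
                delivered.append((modality, alert))

    threads = [threading.Thread(target=worker, args=(m, q))
               for m, q in channels.items()]
    for t in threads:
        t.start()

    for alert in alerts:         # fission: one alert goes to all modalities
        for q in channels.values():
            q.put(alert)
    for q in channels.values():
        q.put(None)              # one sentinel per worker
    for t in threads:
        t.join()
    return delivered

result = dispatch_alerts(["fatigue detected", "stress level high"])
```

Here each of the two alerts is delivered on all three channels, so `result` contains six (modality, alert) pairs; in the real system the workers would drive the loudspeaker, the display, and the force-feedback wheel instead of appending to a list.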
