Social Cars: Sensing, Gathering, Sharing, and Conveying Social Cues to Road Users

Intelligent Transport Systems (ITS) form the infrastructure for ubiquitous computing in the car. ITS encompasses a) sensing technologies within vehicles as well as in the road infrastructure, b) wireless communication protocols through which the sensed information is exchanged between vehicles (V2V) and between vehicles and infrastructure (V2I), and c) intelligent algorithms and computational technologies that process these real-time streams of information. As such, ITS can be considered a game changer: much like the Internet itself, it provides the fundamental basis for new, innovative concepts and applications.

The information sensed or gathered within and around the vehicle has led to a variety of context-aware in-vehicle technologies. A simple example is the Anti-lock Braking System (ABS), which releases the brakes when sensors detect that the wheels are locking up. We refer to this type of context awareness as vehicle/technology awareness. V2V and V2I communication, often summarized as V2X, enables cars to exchange and share sensed information. As a result, the vehicle/technology awareness horizon of each individual car is expanded beyond its directly observable surroundings, paving the way to further enhance such already advanced systems.

In this chapter, we draw attention to those application areas of sensing and V2X technologies in which the human driver, the driver’s behavior, and hence the psychological perspective play a more pivotal role. The focal points of our project are illustrated in Figure 1. In all areas, the vehicle first (1) gathers or senses information about the driver. Rather than limiting the use of such information to vehicle/technology awareness, we see great potential for applications in which this sensed information is (2) fed back to the driver for increased self-awareness. In addition, using V2V technologies, it can also be (3) passed to surrounding drivers for increased social awareness, or (4) pushed even further into the cloud, where it is collected and visualized for an increased, collective urban awareness within the urban community at large, which includes all city dwellers.
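
The four information flows outlined above can be summarized in a short, self-contained sketch. The following Python example is purely illustrative: the class and function names (DriverState, feedback_to_driver, broadcast_v2v, upload_to_cloud) and the JSON serialization are assumptions made for illustration, not an existing V2X API or message standard.

```python
"""Minimal sketch of the four awareness flows: sense -> self -> social -> urban.

All names below are hypothetical; a real deployment would rely on standardized
V2X message sets and a proper radio/cloud stack.
"""

from dataclasses import dataclass, asdict
from enum import Enum
import json
import time


class EmotionalState(Enum):
    CALM = "calm"
    FRUSTRATED = "frustrated"
    AGGRESSIVE = "aggressive"


@dataclass
class DriverState:
    """(1) Information gathered or sensed about the driver and the vehicle."""
    vehicle_id: str
    timestamp: float
    speed_kmh: float
    emotional_state: EmotionalState


def feedback_to_driver(state: DriverState) -> str:
    """(2) Feed the sensed information back to the driver (self-awareness)."""
    if state.emotional_state is EmotionalState.AGGRESSIVE:
        return "Your driving suggests frustration; consider taking a short break."
    return "Your driving style currently looks calm."


def broadcast_v2v(state: DriverState) -> bytes:
    """(3) Share a cue with surrounding vehicles (social awareness).

    JSON over an unspecified transport is used here only for illustration.
    """
    payload = {
        "emotional_state": state.emotional_state.value,
        "timestamp": state.timestamp,
    }
    return json.dumps(payload).encode("utf-8")


def upload_to_cloud(state: DriverState) -> dict:
    """(4) Push an aggregate-ready record into the cloud (urban awareness)."""
    record = asdict(state)
    record["emotional_state"] = state.emotional_state.value
    return record


if __name__ == "__main__":
    sensed = DriverState(
        vehicle_id="car-42",
        timestamp=time.time(),
        speed_kmh=57.0,
        emotional_state=EmotionalState.AGGRESSIVE,
    )
    print(feedback_to_driver(sensed))   # increased self-awareness
    print(broadcast_v2v(sensed))        # increased social awareness
    print(upload_to_cloud(sensed))      # increased urban awareness
```

Note that in this sketch the V2V payload deliberately omits the vehicle identifier; whether and how such driver cues should be anonymized before leaving the car is a design decision in its own right.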
