Adasa: A Conversational In-Vehicle Digital Assistant for Advanced Driver Assistance Features

Advanced Driver Assistance Systems (ADAS) are equipped on most modern vehicles and are intended to assist the driver and enhance the driving experience through features such as lane keeping assistance and adaptive cruise control. However, recent studies show that few drivers use these features, for several reasons. First, ADAS features were uncommon until recently. Second, most users are unfamiliar with these features and do not know what to expect from them. Finally, the interfaces for operating these features are not intuitive. To help drivers understand ADAS features, we present a conversational in-vehicle digital assistant that responds to drivers' questions and commands in natural language. With the system prototyped herein, drivers can ask questions or issue commands in unconstrained natural language; the assistant, trained using advanced machine learning techniques and given access to vehicle signals, responds in real time based on conversational context. We present results from our prototype on a production vehicle, demonstrating its effectiveness in improving drivers' understanding and the usability of ADAS.
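The pipeline the abstract describes, mapping a natural-language utterance to an intent and answering from live vehicle signals, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the keyword matcher stands in for the trained machine-learning classifier, and the signal names and values are hypothetical, not actual CAN identifiers.

```python
# Hypothetical snapshot of vehicle signals (illustrative names and values).
VEHICLE_SIGNALS = {
    "acc_enabled": True,
    "acc_set_speed_mph": 65,
    "lane_keeping_active": False,
}

# Keyword rules standing in for a trained intent classifier.
INTENT_RULES = {
    "query_acc_status": ["cruise", "acc"],
    "query_lane_keeping": ["lane"],
}


def classify_intent(utterance: str) -> str:
    """Map an utterance to an intent label (stand-in for the ML model)."""
    text = utterance.lower()
    for intent, keywords in INTENT_RULES.items():
        if any(word in text for word in keywords):
            return intent
    return "unknown"


def respond(utterance: str, signals: dict) -> str:
    """Combine the classified intent with vehicle signals to form a reply."""
    intent = classify_intent(utterance)
    if intent == "query_acc_status":
        if signals["acc_enabled"]:
            return (f"Adaptive cruise control is on, set to "
                    f"{signals['acc_set_speed_mph']} mph.")
        return "Adaptive cruise control is currently off."
    if intent == "query_lane_keeping":
        state = "active" if signals["lane_keeping_active"] else "inactive"
        return f"Lane keeping assistance is {state}."
    return "Sorry, I didn't understand. Could you rephrase?"


print(respond("Why is my cruise control set so fast?", VEHICLE_SIGNALS))
```

In the full system, `classify_intent` would be a statistical model trained on driver utterances, and the signal dictionary would be populated in real time from the vehicle bus; the two-stage structure (understand, then ground the answer in vehicle state) is the point of the sketch.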
