Automotive multimodal human-machine interface
[1] Jonathan Dobres, et al. Effects of an 'Expert Mode' Voice Command System on Task Performance, Glance Behavior & Driver Physiology, 2014, AutomotiveUI.
[2] S. N. Roscoe, et al. Eye accommodation to head-up virtual images, 1988, Human Factors.
[3] Robert Grimm, et al. Systems directions for pervasive computing, 2001, Proceedings Eighth Workshop on Hot Topics in Operating Systems.
[4] Peter Bengtsson, et al. Haptic, visual and cross-modal perception of interface information, 2007.
[5] M. Kruijshaar, et al. The association between HIV and antituberculosis drug resistance, 2008, European Respiratory Journal.
[6] Matthew R. E. Romoser. An Autonomous Intelligent Driving Simulation Tutor for Driver Training and Remediation: A Concept Paper, 2017.
[7] Sara Bongartz, et al. International evaluation of NLU benefits in the domain of in-vehicle speech dialog systems, 2013, AutomotiveUI.
[8] Anita Gärling, et al. Do redundant head-up and head-down display configurations cause distractions?, 2017.
[9] Arjan Kuijper, et al. Capacitive proximity sensing in smart environments, 2015, J. Ambient Intell. Smart Environ.
[10] Mark Vollrath, et al. Accident Analysis and Prevention, 2009.
[11] Dirk Schnelle, et al. Context aware voice user interfaces for workflow support: voice based support for mobile workers, 2008.
[12] Terry Winograd, et al. Procedures As A Representation For Data In A Computer Program For Understanding Natural Language, 1971.
[13] Sharon L. Oviatt, et al. Human-centered design meets cognitive load theory: designing interfaces that help people think, 2006, MM '06.
[14] John Sweller, et al. Cognitive Load During Problem Solving: Effects on Learning, 1988, Cogn. Sci.
[15] Klaus Bengler, et al. Eye Gaze Studies Comparing Head-Up and Head-Down Displays in Vehicles, 2007, IEEE International Conference on Multimedia and Expo.
[16] Tom Wellings, et al. Assessing subjective response to haptic feedback in automotive touchscreens, 2009, AutomotiveUI.
[17] Michael F. McTear, et al. Book Review, 2005, Computational Linguistics.
[18] H. Bülthoff, et al. Merging the senses into a robust percept, 2004, Trends in Cognitive Sciences.
[19] Elena Mugellini, et al. In-Vehicle Natural Interaction based on Electromyography, 2012.
[20] Ian Spence, et al. The Commingled Division of Visual Attention, 2015, PLoS ONE.
[21] Andrew L. Kun, et al. Estimating cognitive load using remote eye tracking in a driving simulator, 2010, ETRA.
[22] Christopher J. D. Patten. Cognitive Workload and the Driver: Understanding the Effects of Cognitive Workload on Driving from a Human Information Processing Perspective, 2007.
[23] Keith Duncan, et al. Cognitive Engineering, 2017, Encyclopedia of GIS.
[24] David L. Strayer, et al. Visual and Cognitive Demands of Using In-Vehicle Infotainment Systems, 2017.
[25] Joshua D. Hoffman, et al. Collision warning design to mitigate driver distraction, 2004, CHI.
[26] J. Sweller. Element Interactivity and Intrinsic, Extraneous, and Germane Cognitive Load, 2010.
[27] Norman H. Villaroman, et al. Teaching natural user interaction using OpenNI and the Microsoft Kinect sensor, 2011, SIGITE '11.
[28] Joel M. Cooper, et al. Measuring Cognitive Distraction in the Automobile III: A Comparison of Ten 2015 In-Vehicle Information Systems, 2015.
[29] Lucila Ohno-Machado, et al. Natural language processing: an introduction, 2011, J. Am. Medical Informatics Assoc.
[30] Michael Weber, et al. Autonomous driving: investigating the feasibility of car-driver handover assistance, 2015, AutomotiveUI.
[31] Alois Ferscha, et al. Standardization of the in-car gesture interaction space, 2013, AutomotiveUI.
[32] Albrecht Schmidt, et al. Gestural interaction on the steering wheel: reducing the visual demand, 2011, CHI.
[33] C. Baber, et al. An experimental comparison of text and symbols for in-car reconfigurable displays, 1992, Applied Ergonomics.
[34] Andreas Butz, et al. What you see is what you touch: visualizing touch screen interaction in the head-up display, 2014, Conference on Designing Interactive Systems.
[35] Christopher D. Wickens, et al. Multiple resources and performance prediction, 2002.
[36] Thomas A. Dingus, et al. The Impact of Driver Inattention on Near-Crash/Crash Risk: An Analysis Using the 100-Car Naturalistic Driving Study Data, 2006.
[37] Abdulmotaleb El-Saddik, et al. Motion-path based in car gesture control of the multimedia devices, 2011, DIVANet '11.
[38] Richard Allen Young. Self-Regulation Reduces Crash Risk from the Attentional Effects of Cognitive Load from Auditory-Vocal Tasks, 2014.
[39] Noah J. Goodall, et al. Machine Ethics and Automated Vehicles, 2020, ArXiv.
[40] Stefan Radomski, et al. Formal verification of multimodal dialogs in pervasive environments, 2015.
[41] Christian A. Müller, et al. "Geremin": 2D microgestures for drivers based on electric field sensing, 2011, IUI '11.
[42] Jock D. Mackinlay, et al. A morphological analysis of the design space of input devices, 1991, TOIS.
[43] C. A. Pickering, et al. A Review of Automotive Human Machine Interface Technologies and Techniques to Reduce Driver Distraction, 2007.
[44] Steve Summerskill, et al. Feeling your way home: the use of haptic interfaces within cars to make safety pleasurable, 2003.
[45] Samarjit Chakraborty, et al. AR-IVI — Implementation of In-Vehicle Augmented Reality, 2014, IEEE International Symposium on Mixed and Augmented Reality (ISMAR).
[46] Dong Yu, et al. An introduction to voice search, 2008, IEEE Signal Processing Magazine.
[47] Richard A. Young, et al. Self-regulation minimizes crash risk from attentional effects of cognitive load during auditory-vocal tasks, 2014.
[48] John D. Lee, et al. Defining Driver Distraction, 2009.
[49] Arthur D. Fisk, et al. Touch a Screen or Turn a Knob: Choosing the Best Device for the Job, 2005, Hum. Factors.
[50] Ernst Pöppel, et al. Effects of display position of a visual in-vehicle task on simulated driving, 2006, Applied Ergonomics.
[51] Albrecht Schmidt, et al. Design space for driver-based automotive user interfaces, 2009, AutomotiveUI.
[52] Albrecht Schmidt, et al. Multimodal interaction in the car: combining speech and gestures on the steering wheel, 2012, AutomotiveUI.
[53] Radu-Daniel Vatavu, et al. User-defined gestures for free-hand TV control, 2012, EuroITV.
[54] Mario J. Enriquez, et al. A pneumatic tactile alerting system for the driving environment, 2001, PUI '01.
[55] Omer Tsimhoni, et al. Address Entry While Driving: Speech Recognition Versus a Touch-Screen Keyboard, 2004, Hum. Factors.
[56] R. Engle, et al. Is working memory capacity task dependent?, 1989.
[57] Dario D. Salvucci. Predicting the effects of in-car interface use on driver performance: an integrated model approach, 2001, Int. J. Hum. Comput. Stud.
[58] Joëlle Coutaz, et al. A design space for multimodal systems: concurrent processing and data fusion, 1993, INTERCHI.
[59] F. Paas. Training strategies for attaining transfer of problem-solving skill in statistics: A cognitive-load approach, 1992.
[60] Hansjörg Hofmann. Intuitive speech interface technology for information exchange tasks, 2015.
[61] C. G. Keller, et al. Will the Pedestrian Cross? A Study on Pedestrian Path Prediction, 2014, IEEE Transactions on Intelligent Transportation Systems.
[62] Julia Niemann, et al. Designing Speech Output for In-car Infotainment Applications Based on a Cognitive Model of Attention Allocation, 2013.
[63] Michael A. Regan, et al. Driver distraction: A review of the literature, 2003.
[64] Daniel G. Bobrow, et al. Natural Language Input for a Computer Problem Solving System, 1964.
[65] Daniel J. Wigdor, et al. Direct-touch vs. mouse input for tabletop displays, 2007, CHI.
[66] N. McBrien, et al. The influence of cognition and age on accommodation, detection rate and response times when using a car head-up display (HUD), 1998, Ophthalmic & Physiological Optics.
[67] Luke Fletcher, et al. Correlating driver gaze with the road scene for driver assistance systems, 2005, Robotics Auton. Syst.
[68] Neville A. Stanton, et al. To twist or poke? A method for identifying usability issues with the rotary controller and touch screen for control of in-vehicle information systems, 2011, Ergonomics.
[69] Mikael B. Skov, et al. You can touch, but you can't look: interacting with in-vehicle systems, 2008, CHI.
[70] J. Dusek, et al. Impact of Incremental Increases in Cognitive Workload on Physiological Arousal and Performance in Young Adult Drivers, 2009.
[71] Hyogon Kim, et al. Do you see what I see: towards a gaze-based surroundings query processing system, 2015, AutomotiveUI.
[72] Sharon L. Oviatt, et al. Advances in Robust Multimodal Interface Design, 2003, IEEE Computer Graphics and Applications.