Effects of Non-Speech Auditory Cues on Control Transition Behaviors in Semi-Automated Vehicles: Empirical Study, Modeling, and Validation
Myounghoon Jeon | Yiqi Zhang | Sangjin Ko | Kyle Kutchek
[1] Joost C. F. de Winter, et al. Determinants of take-over time from automated driving: A meta-analysis of 129 studies, 2019, Transportation Research Part F: Traffic Psychology and Behaviour.
[2] Klaus Bengler, et al. “Take over!” How long does it take to get the driver back into the loop?, 2013.
[3] Kathrin Zeeb, et al. What determines the take-over time? An integrated model approach of driver take-over after automated driving, 2015, Accident Analysis & Prevention.
[4] Carryl L. Baldwin, et al. Perceived urgency mapping across modalities within a driving context, 2014, Applied Ergonomics.
[5] Harsh Sanghavi, et al. Multimodal Takeover Request Displays for Semi-automated Vehicles: Focused on Spatiality and Lead Time, 2021, HCI.
[6] Omer Tsimhoni, et al. Haptic seat for automated driving: preparing the driver to take control effectively, 2015, AutomotiveUI.
[7] Lisa J. Molnar. Age-Related Differences in Driver Behavior Associated with Automated Vehicles and the Transfer of Control between Automated and Manual Control: A Simulator Evaluation, 2017.
[8] John Richardson, et al. Alarm timing, trust and driver expectation for forward collision warning systems, 2006, Applied Ergonomics.
[9] P. A. Hancock, et al. Alarm effectiveness in driver-centred collision-warning systems, 1997, Ergonomics.
[10] Peter J. Seiler, et al. Development of a collision avoidance system, 1998.
[11] Myounghoon Jeon, et al. Modeling the effects of auditory display takeover requests on drivers' behavior in autonomous vehicles, 2019, AutomotiveUI.
[12] Frank E. Pollick, et al. Language-based multimodal displays for the handover of control in autonomous cars, 2015, AutomotiveUI.
[13] Myounghoon Jeon, et al. Menu Navigation With In-Vehicle Technologies: Auditory Menu Cues Improve Dual Task Performance, Preference, and Workload, 2015, Int. J. Hum. Comput. Interact.
[14] W. Dorner, et al. Warning Apps for Road Safety: A Technological and Economical Perspective for Autonomous Driving – The Warning Task in the Transition from Human Driver to Automated Driving, 2021, Int. J. Hum. Comput. Interact.
[15] Carryl L. Baldwin, et al. Validation of Essential Acoustic Parameters for Highly Urgent In-Vehicle Collision Warnings, 2018, Hum. Factors.
[16] Pasi Lautala, et al. Effects of Auditory Display Types and Acoustic Variables on Subjective Driver Assessment in a Rail Crossing Context, 2021, Transportation Research Record: Journal of the Transportation Research Board.
[17] Wendy A. Rogers, et al. Warning Research: An Integrative Perspective, 2000, Hum. Factors.
[18] Shinichiro Horiuchi, et al. An Analytical Approach to the Prediction of Handling Qualities of Vehicles With Advanced Steering Control System Using Multi-Input Driver Model, 2000.
[19] Changxu Wu, et al. Effects of lead time of verbal collision warning messages on driving behavior in connected vehicle settings, 2016, Journal of Safety Research.
[20] Toshio Ito, et al. Time Required for Take-over from Automated to Manual Driving, 2016.
[21] Changxu Wu, et al. Learn to Integrate Mathematical Models in Human Performance Modeling, 2017.
[22] Dario D. Salvucci, et al. Distract-R: rapid prototyping and evaluation of in-vehicle interfaces, 2005, CHI.
[23] Stefano Baldan, et al. Report on the In-vehicle Auditory Interactions Workshop: Taxonomy, Challenges, and Approaches, 2015, AutomotiveUI.
[24] J. Edworthy, et al. Improving Auditory Warning Design: Quantifying and Predicting the Effects of Different Warning Parameters on Perceived Urgency, 1993, Human Factors.
[25] Tal Oron-Gilad, et al. The Effects of Continuous Driving-Related Feedback on Drivers’ Response to Automation Failures, 2017.
[26] Daniel V. McGehee, et al. Collision Warning Timing, Driver Distraction, and Driver Response to Imminent Rear-End Collisions in a High-Fidelity Driving Simulator, 2002, Hum. Factors.
[27] John Richardson, et al. The effect of alarm timing on driver behaviour: an investigation of differences in driver trust and response to alarms according to alarm timing, 2004.
[28] Myounghoon Jeon, et al. Blueprint of the Auditory Interactions in Automated Vehicles: Report on the Workshop and Tutorial, 2017, AutomotiveUI.
[29] Mowei Shen, et al. Modeling the development of vehicle lateral control skills in a cognitive architecture, 2015.
[30] Heejin Jeong, et al. Modeling of Stimulus-Response Secondary Tasks with Different Modalities while Driving in a Computational Cognitive Architecture, 2017.
[31] Omer Tsimhoni, et al. Using a Vibrotactile Seat for Facilitating the Handover of Control during Automated Driving, 2017, Int. J. Mob. Hum. Comput. Interact.
[32] Jordan Navarro, et al. Human–machine interaction theories and lane departure warnings, 2017.
[33] Yi-Li Liu, et al. Modeling driver car-following based on the queuing network cognitive architecture, 2009, 2009 International Conference on Machine Learning and Cybernetics.
[34] Yili Liu, et al. Modeling Steering Using the Queueing Network-Model Human Processor (QN-MHP), 2003.
[35] SAE International. Taxonomy and definitions for terms related to driving automation systems for on-road motor vehicles, 2022.
[36] Chin-Teng Lin, et al. Assessing Effectiveness of Various Auditory Warning Signals in Maintaining Drivers' Attention in Virtual Reality-Based Driving Environments, 2009, Perceptual and Motor Skills.
[37] Judy Edworthy. Does sound help us to work better with machines? A commentary on Rauterberg's paper ‘About the importance of auditory alarms during the operation of a plant simulator’, 1998.
[38] Changxu Wu, et al. Mathematical Modeling of the Effects of Speech Warning Characteristics on Human Performance and Its Application in Transportation Cyberphysical Systems, 2016, IEEE Transactions on Intelligent Transportation Systems.
[39] Harsh Sanghavi, et al. Effects of Anger and Display Urgency on Takeover Performance in Semi-automated Vehicles, 2020, AutomotiveUI.
[40] Yili Liu, et al. Queuing Network Modeling of Driver Lateral Control With or Without a Cognitive Distraction Task, 2012, IEEE Transactions on Intelligent Transportation Systems.
[41] Catherine M. Burns, et al. Sonification Discriminability and Perceived Urgency, 2012.
[42] Frank E. Ritter, et al. Providing user models direct access to interfaces: an exploratory study of a simple interface with implications for HRI and HCI, 2006, IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans.
[43] Neville A. Stanton, et al. Rolling Out the Red (and Green) Carpet: Supporting Driver Decision Making in Automation-to-Manual Transitions, 2019, IEEE Transactions on Human-Machine Systems.
[44] Changxu Wu, et al. Mathematical Modeling of Driver Speed Control With Individual Differences, 2013, IEEE Transactions on Systems, Man, and Cybernetics: Systems.
[45] J. C. F. de Winter, et al. Comparing spatially static and dynamic vibrotactile take-over requests in the driver seat, 2017, Accident Analysis & Prevention.
[46] Frank E. Pollick, et al. Using Multimodal Displays to Signify Critical Handovers of Control to Distracted Autonomous Car Drivers, 2017, Int. J. Mob. Hum. Comput. Interact.
[47] Frederik Diederichs, et al. Take-Over Requests for Automated Driving, 2015.
[48] John Richardson, et al. The influence of alarm timing on braking response and driver trust in low speed driving, 2005.
[49] Yili Liu, et al. Queuing Network Modeling of Driver Workload and Performance, 2006, IEEE Transactions on Intelligent Transportation Systems.
[50] Myounghoon Jeon, et al. Auditory Displays for Take-Over in Semi-automated Vehicles, 2018, HCI.
[51] Myounghoon Jeon. Multimodal Displays for Take-over in Level 3 Automated Vehicles while Playing a Game, 2019, CHI Extended Abstracts.
[52] Ulrike Schmuntzsch, et al. The warning glove: development and evaluation of a multimodal action-specific warning prototype, 2014, Applied Ergonomics.
[53] Yili Liu, et al. Queuing network modeling of the psychological refractory period (PRP), 2008, Psychological Review.
[54] Simon Farrell, et al. Computational Modeling in Cognition: Principles and Practice, 2010.
[55] Christopher B. Mayhorn, et al. Special issue on warnings: advances in delivery, application, and methods, 2014, Applied Ergonomics.
[56] Kelly Funkhouser, et al. Putting the Brakes on Autonomous Vehicle Control, 2016.
[57] Changxu Wu, et al. Modeling the Effects of Warning Lead Time, Warning Reliability and Warning Style on Human Performance Under Connected Vehicle Settings, 2018.
[58] Christopher D. Wickens, et al. Multiple resources and performance prediction, 2002.
[59] Christopher D. Wickens, et al. Multiple Resources and Mental Workload, 2008, Hum. Factors.
[60] Denis McKeown, et al. Candidates for within-vehicle auditory displays, 2005.
[61] Changxu Wu, et al. The Five Key Questions of Human Performance Modeling, 2016, International Journal of Industrial Ergonomics.
[62] Myounghoon Jeon. Auditory User Interface Design, 2015.
[63] Klaus Bengler, et al. Take-over again: Investigating multimodal and directional TORs to get the driver back into the loop, 2017, Applied Ergonomics.
[64] J. Edworthy, et al. Improving Auditory Warning Design: Relationship between Warning Sound Parameters and Perceived Urgency, 1991, Human Factors.
[65] Lewis L. Chuang, et al. Looming Auditory Collision Warnings for Semi-Automated Driving: An ERP Study, 2018, AutomotiveUI.
[66] Ling Chen, et al. A Comparative Study of Sonification Methods to Represent Distance and Forward-Direction in Pedestrian Navigation, 2014, Int. J. Hum. Comput. Interact.
[67] Dario D. Salvucci. Modeling Driver Behavior in a Cognitive Architecture, 2006, Hum. Factors.
[68] Carryl L. Baldwin, et al. Multimodal urgency coding: auditory, visual, and tactile parameters and their impact on perceived urgency, 2012, Work.
[69] Judy Edworthy, et al. Quantifying the perceived urgency of auditory warnings, 1989.