Interactive gaze and finger controlled HUD for cars
Pradipta Biswas | Vinay Krishna Sharma | Aparna Ramakrishnan | Modiksha Madan | Gowdham Prabhakar | L. R. D. Murthy | Sachin Deshmukh
[1] Pradipta Biswas. Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments, 2016, SpringerBriefs in Computer Science.
[2] Carl Jörgen Normark, et al. Design and Evaluation of a Touch-Based Personalizable In-Vehicle User Interface, 2015, Int. J. Hum. Comput. Interact.
[3] Christof Lutteroth, et al. Gaze vs. Mouse: A Fast and Accurate Gaze-Only Click Alternative, 2015, UIST.
[4] Shumin Zhai, et al. The metropolis keyboard: an exploration of quantitative techniques for virtual keyboard design, 2000, UIST '00.
[5] Pradipta Biswas, et al. A new interaction technique involving eye gaze tracker and scanning system, 2013, ETSA '13.
[6] Pradipta Biswas, et al. Estimating Pilots’ Cognitive Load From Ocular Parameters Through Simulation and In-Flight Studies, 2019, Journal of Eye Movement Research.
[7] Angela Castronovo, et al. Generating a Personalized UI for the Car: A User-Adaptive Rendering Architecture, 2013, UMAP.
[8] Wonil Hwang, et al. Haptic Seat Interfaces for Driver Information and Warning Systems, 2011, Int. J. Hum. Comput. Interact.
[9] Yun Fu, et al. hMouse: Head Tracking Driven Virtual Computer Mouse, 2007, IEEE Workshop on Applications of Computer Vision (WACV '07).
[10] Luis Figueiredo, et al. Hands-free interaction with a computer and other technologies, 2009, Universal Access in the Information Society.
[11] Per Ola Kristensson, et al. Subtle gaze-dependent techniques for visualising display changes in multi-display environments, 2013, IUI '13.
[12] Bruce N. Walker, et al. A Multimodal Air Gesture Interface for In Vehicle Menu Navigation, 2014, AutomotiveUI.
[13] Justin S. Graving, et al. Human Factors Design Guidance for Level 2 and Level 3 Automated Driving Concepts, 2018.
[14] Min Chen, et al. Glyph-based Visualization: Foundations, Design Guidelines, Techniques and Applications, 2013, Eurographics.
[15] Gerhard Rigoll, et al. Gaze-based interaction on multiple displays in an automotive environment, 2011, IEEE International Conference on Systems, Man, and Cybernetics.
[16] Patrick Langdon, et al. Multimodal Intelligent Eye-Gaze Tracking System, 2015, Int. J. Hum. Comput. Interact.
[17] Mark Ashdown, et al. Combining head tracking and mouse input for a GUI on multiple monitors, 2005, CHI Extended Abstracts.
[18] Mohan M. Trivedi, et al. Hand Gesture Recognition in Real Time for Automotive Interfaces: A Multimodal Vision-Based Approach and Evaluations, 2014, IEEE Transactions on Intelligent Transportation Systems.
[19] Gowdham Prabhakar, et al. Detecting drivers’ cognitive load from saccadic intrusion, 2018.
[20] Low-Cost Non-Imaging Eye Tracker System for Computer Control, 2017 (patent application).
[21] Hans-Werner Gellersen, et al. Gaze and Touch Interaction on Tablets, 2016, UIST.
[22] Sandra P. Marshall, et al. Identifying cognitive state from eye metrics, 2007, Aviation, Space, and Environmental Medicine.
[23] Kyungdoh Kim, et al. Utilization of Visual Information Perception Characteristics to Improve Classification Accuracy of Driver’s Visual Search Intention for Intelligent Vehicle, 2015, Int. J. Hum. Comput. Interact.
[24] Yanxia Zhang, et al. SideWays: a gaze interface for spontaneous interaction with situated displays, 2013, CHI.
[25] Thorsten O. Zander, et al. Combining Eye Gaze Input With a Brain–Computer Interface for Touchless Human–Computer Interaction, 2010, Int. J. Hum. Comput. Interact.
[26] Ronald R. Mourant, et al. Luminance Specifications for Automobile Instrument Panels, 1976, Human Factors.
[27] Albrecht Schmidt, et al. Making use of drivers' glances onto the screen for explicit gaze-based interaction, 2010, AutomotiveUI.
[28] Johannes Schöning, et al. Combining Direct and Indirect Touch Input for Interactive Workspaces using Gaze Input, 2015, SUI.