iWink: Exploring Eyelid Gestures on Mobile Devices
Ying Han | Khai N. Truong | Zhen Li | Mingming Fan
[1] Robert J. K. Jacob, et al. What you look at is what you get: eye movement-based interaction techniques, 1990, CHI.
[2] Paul Lukowicz, et al. In the blink of an eye: combining head motion and eye blink frequency for activity recognition with Google Glass, 2014, AH.
[3] Fei Yang, et al. Recognizing eyebrow and periodic head gestures using CRFs for non-manual grammatical marker detection in ASL, 2013, FG.
[4] Hans-Werner Gellersen, et al. Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements, 2015, UIST.
[5] Pawel Strumillo, et al. Eye-blink detection system for human–computer interaction, 2011, Universal Access in the Information Society.
[6] L. L. Palmer, et al. Inability to Wink an Eye and Eye Dominance, 1976, Perceptual and Motor Skills.
[7] I. Scott MacKenzie, et al. BlinkWrite: efficient text entry using eye blinks, 2011, Universal Access in the Information Society.
[8] Robin Shaw, et al. The eye wink control interface: using the computer to provide the severely disabled with increased flexibility and comfort, 1990, Third Annual IEEE Symposium on Computer-Based Medical Systems.
[9] Johannes Schöning, et al. Falling asleep with Angry Birds, Facebook and Kindle: a large scale study on mobile application usage, 2011, MobileHCI.
[10] Ravin Balakrishnan, et al. Porous Interfaces for Small Screen Multitasking using Finger Identification, 2016, UIST.
[11] Shumin Zhai, et al. Manual and gaze input cascaded (MAGIC) pointing, 1999, CHI.
[12] Marcos Serrano, et al. Exploring the use of hand-to-face input for interacting with head-worn displays, 2014, CHI.
[13] Chi-Ho Chan, et al. MouthType: text entry by hand and mouth, 2004, CHI EA.
[14] Shwetak N. Patel, et al. Tongue-in-Cheek: Using Wireless Signals to Enable Non-Intrusive and Flexible Facial Gestures Detection, 2015, CHI.
[15] Tovi Grossman, et al. BlyncSync: Enabling Multimodal Smartwatch Gestures with Synchronous Touch and Blink, 2020, CHI.
[16] Tovi Grossman, et al. No Need to Stop What You're Doing: Exploring No-Handed Smartwatch Interaction, 2017, Graphics Interface.
[17] Chun Yu, et al. Clench Interface: Novel Biting Input Techniques, 2019, CHI.
[18] J. Stern, et al. The endogenous eyeblink, 1984, Psychophysiology.
[19] Margrit Betke, et al. Communication via eye blinks and eyebrow raises: video-based human-computer interfaces, 2003, Universal Access in the Information Society.
[20] Mike Y. Chen, et al. EyeExpression: exploring the use of eye expressions as hands-free input for virtual and augmented reality devices, 2017, VRST.
[21] Kari-Jouko Räihä, et al. Simple gaze gestures and the closure of the eyes as an interaction technique, 2012, ETRA.
[22] I. Scott MacKenzie, et al. BlinkWrite2: an improved text entry method using eye blinks, 2010, ETRA.
[23] Gaël Varoquaux, et al. Scikit-learn: Machine Learning in Python, 2011, Journal of Machine Learning Research.
[24] Zhen Li, et al. Eyelid Gestures on Mobile Devices for People with Motor Impairments, 2020, ASSETS.
[25] Yuta Sugiura, et al. CheekInput: turning your cheek into an input surface by embedded optical sensors on a head-mounted display, 2017, VRST.
[26] Sunghoon Kwon, et al. EOG-based glasses-type wireless mouse for the disabled, 1999, Proceedings of the First Joint BMES/EMBS Conference.
[27] Mike Y. Chen, et al. Wink it: investigating wink-based interactions for smartphones, 2018, MobileHCI Adjunct.
[28] Roel Vertegaal, et al. Media eyepliances: using eye tracking for remote control focus selection of appliances, 2005, CHI EA.
[29] Antonio Krüger, et al. Back to the app: the costs of mobile application interruptions, 2012, MobileHCI.
[30] Hans-Werner Gellersen, et al. Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets, 2013, UbiComp.
[31] Buntarou Shizuki, et al. CanalSense: Face-Related Movement Recognition System based on Sensing Air Pressure in Ear Canals, 2017, UIST.
[32] Arie E. Kaufman, et al. An eye tracking computer user interface, 1993, IEEE Research Properties in Virtual Reality Symposium.
[33] Hans-Werner Gellersen, et al. Gaze-touch: combining gaze with multi-touch for interaction on the same surface, 2014, UIST.
[34] Florian Alt, et al. GazeTouchPIN: protecting sensitive data on mobile devices using secure multimodal authentication, 2017, ICMI.
[35] Fabian Hemmert, et al. Perspective change: a system for switching between on-screen views by closing one eye, 2008, AVI.
[36] Meredith Ringel Morris, et al. Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities, 2017, CHI.
[37] Dariusz Sawicki, et al. Blink and wink detection as a control tool in multimodal interaction, 2018, Multimedia Tools and Applications.
[38] Yanxia Zhang, et al. Gaze-Shifting: Direct-Indirect Input with Pen and Touch Modulated by Gaze, 2015, UIST.
[39] Hojun Lee, et al. Hand-free natural user interface for VR HMD with IR based facial gesture tracking sensor, 2017, VRST.
[40] Daniel J. Wigdor, et al. Palpebrae superioris: exploring the design space of eyelid gestures, 2015, Graphics Interface.
[41] Emiliano Miluzzo, et al. EyePhone: activating mobile phones with your eyes, 2010, MobiHeld.