TeethTap: Recognizing Discrete Teeth Gestures Using Motion and Acoustic Sensing on an Earpiece
Wei Sun | Franklin Mingzhe Li | Congshu Huang | Zhenyu Zhang | Benjamin Steeper | Songlin Xu | Feng Tian | Cheng Zhang