ReflecTouch: Detecting Grasp Posture of Smartphone Using Corneal Reflection Images

By sensing how a user is holding a smartphone, adaptive user interfaces become possible, such as interfaces that automatically switch the displayed content and the position of graphical user interface (GUI) components according to how the phone is being held. We propose ReflecTouch, a novel method for detecting how a smartphone is being held by capturing images of the smartphone screen reflected on the cornea with the built-in front camera. In these images, the areas where the user places their fingers on the screen appear as shadows, which makes it possible to estimate the grasp posture. Since most smartphones have a front camera, the method works regardless of the device model and requires no additional sensors or hardware. We conducted data collection experiments to verify the classification accuracy of the proposed method for six grasp postures and obtained an accuracy of 85%.
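The abstract describes the pipeline only at a high level. The sketch below illustrates one plausible realization in Python: locate the eye in a front-camera frame, crop the corneal region where the screen reflection (and the finger shadows on it) appears, and classify the shadow pattern into one of six postures. It assumes OpenCV's bundled Haar eye cascade and a hypothetical pretrained ONNX classifier (`grasp_model.onnx`); none of these components, file names, or parameters come from the paper itself.

```python
# A minimal sketch of a corneal-reflection grasp classifier, assuming
# OpenCV's bundled Haar eye cascade and a hypothetical pretrained ONNX
# model; neither component is taken from the paper.
import cv2
import numpy as np

# Placeholder labels: the abstract does not enumerate the six postures.
GRASP_POSTURES = [f"posture_{i}" for i in range(6)]

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")


def crop_cornea(frame: np.ndarray):
    """Find an eye in a front-camera frame and crop the corneal region,
    where the screen reflection and the finger shadows on it appear."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return None
    # Keep the largest detection; the reflection occupies the eye's center.
    x, y, w, h = max(eyes, key=lambda e: e[2] * e[3])
    return frame[y:y + h, x:x + w]


def classify_grasp(cornea: np.ndarray, net) -> str:
    """Classify the grasp posture from the shadow pattern in the reflection."""
    blob = cv2.dnn.blobFromImage(cornea, scalefactor=1 / 255.0, size=(64, 64))
    net.setInput(blob)
    scores = net.forward()
    return GRASP_POSTURES[int(np.argmax(scores))]


if __name__ == "__main__":
    net = cv2.dnn.readNetFromONNX("grasp_model.onnx")  # hypothetical model file
    cap = cv2.VideoCapture(0)  # front camera
    ok, frame = cap.read()
    cap.release()
    if ok and (cornea := crop_cornea(frame)) is not None:
        print("Estimated grasp posture:", classify_grasp(cornea, net))
```

A Haar cascade stands in here for whatever eye detector the authors actually used; in practice a dedicated iris or limbus detector would give a tighter crop of the reflection region before classification.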
