Detecting users' handedness for ergonomic adaptation of mobile user interfaces

We often operate mobile devices with only one hand, which then serves two purposes: holding the device and operating the touch screen with the thumb. The current trend toward larger screens, however, makes it nearly impossible for users with average hand sizes to reach all parts of the screen, especially the top area. One solution is to offer adaptive user interfaces for such one-handed interactions. However, these modes have to be triggered manually, which incurs a critical overhead, and they are designed to bring all content closer regardless of whether the phone is operated with the left or the right hand. In this paper, we present an algorithm that determines the user's interacting hand from their unlocking behavior. It correctly distinguishes one- and two-handed usage as well as left- and right-handed unlocking in 98.51% of all cases, using a k-nearest-neighbor comparison of the smartphone's internal sensor readings during the unlocking process.
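
To illustrate the general idea, the following is a minimal sketch of such a classifier in Python with scikit-learn: a k-nearest-neighbor model over summary statistics of the motion-sensor trace recorded during the unlock gesture. The feature set, trace layout, and label names are illustrative assumptions, not the paper's exact pipeline.

```python
# Sketch: classify the interacting hand from smartphone sensor traces
# recorded during the unlock gesture, using k-nearest neighbors.
# Assumed trace layout and features are hypothetical, for illustration only.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def extract_features(trace):
    """Summarize one unlock gesture's motion-sensor trace.

    `trace` is assumed to be an (n_samples, 6) array:
    columns 0-2 accelerometer (x, y, z), columns 3-5 gyroscope.
    """
    return np.concatenate([
        trace.mean(axis=0),  # average tilt / rotation per axis
        trace.std(axis=0),   # movement intensity per axis
        trace.min(axis=0),   # extremes capture the gesture's reach
        trace.max(axis=0),
    ])

def train_classifier(traces, labels, k=5):
    """Fit a k-NN model on recorded unlock gestures.

    `labels` could hold values such as "left", "right", or
    "two-handed" (hypothetical label names).
    """
    X = np.stack([extract_features(t) for t in traces])
    clf = KNeighborsClassifier(n_neighbors=k)
    clf.fit(X, labels)
    return clf

# Usage: clf = train_classifier(traces, labels)
#        hand = clf.predict([extract_features(new_trace)])[0]
```

A lazy learner like k-NN fits this setting well: each user contributes only a handful of labeled unlock gestures, and classification reduces to comparing a new trace against those stored examples.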
