A Design Space for User Interface Elements using Finger Orientation Input

Despite touchscreens being used by billions of people every day, today's touch-based interactions are limited in their expressiveness, as they mostly reduce the rich information of the touching finger to a single 2D point. Researchers have proposed finger orientation as an additional input to overcome these limitations, adding two extra dimensions: the finger's pitch and yaw angles. Although finger orientation has been studied in depth over the last decade, an updated design space is needed. We therefore combine expert interviews with a literature review to describe the wide range of finger orientation input opportunities. First, we present a comprehensive set of user interface elements enhanced by finger orientation input, supported by expert interviews. Second, we extract design implications that follow from the additional input parameters. Finally, we introduce a design space for finger orientation input.
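To illustrate the added expressiveness, a finger-orientation-aware touch event carries two angles alongside the usual 2D position. The sketch below is a hypothetical data structure for such an event; the names and angle conventions are our own assumptions, not an API from the paper:

```python
from dataclasses import dataclass

@dataclass
class OrientationTouch:
    """A touch point augmented with finger orientation (hypothetical sketch)."""
    x: float      # 2D contact position on the screen, in px
    y: float
    pitch: float  # elevation of the finger relative to the screen plane, in degrees
    yaw: float    # rotation of the finger around the screen normal, in degrees

# Example: a touch at (120, 340) with the finger at 35 degrees pitch, pointing right
touch = OrientationTouch(x=120.0, y=340.0, pitch=35.0, yaw=90.0)
print(touch)
```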
