Sensing Posture-Aware Pen+Touch Interaction on Tablets

Many status-quo interfaces for tablets with pen+touch input force users to reach for device-centric UI widgets at fixed locations, rather than sensing and adapting to user-centric posture. To address this problem, we propose sensing techniques that transition between various nuances of mobile and stationary use via postural awareness. These postural nuances include shifting hand grips, varying screen angle and orientation, planting the palm while writing or sketching, and detecting the direction from which the hands approach. To achieve this, our system combines three sensing modalities: 1) raw capacitance touchscreen images, 2) inertial motion, and 3) electric field sensors around the screen bezel for grasp and hand-proximity detection. We show how these sensors enable posture-aware pen+touch techniques that adapt interaction and morph user interface elements to suit fine-grained contexts of body-, arm-, hand-, and grip-centric frames of reference.
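To make the three-modality fusion concrete, the sketch below shows one way a coarse posture classifier over these signals could look. This is a minimal illustration, not the paper's actual pipeline: the sensor frame layout, field names, and all thresholds are assumptions introduced here for clarity.

```python
# Hypothetical sketch: fusing the three sensing modalities described above
# (raw capacitance image, inertial motion, bezel electric-field proximity)
# into a coarse posture estimate. Formats and thresholds are illustrative.

from dataclasses import dataclass, field


@dataclass
class SensorFrame:
    cap_image: list            # raw capacitance image (rows x cols of floats)
    tilt_deg: float            # screen angle from horizontal, from the IMU
    motion_rms: float          # recent accelerometer energy (0 = stationary)
    bezel_proximity: dict = field(default_factory=dict)  # e-field per edge


def infer_posture(frame: SensorFrame) -> str:
    """Classify a coarse usage posture from one fused sensor frame."""
    # A large contiguous capacitance blob suggests a planted palm
    # (writing or sketching posture).
    palm_area = sum(v > 0.5 for row in frame.cap_image for v in row)
    if palm_area > 40:
        return "palm-planted writing"

    # Strong e-field signal on a bezel edge plus ongoing motion
    # suggests a mobile, handheld grip on that side.
    gripped = [edge for edge, s in frame.bezel_proximity.items() if s > 0.7]
    if gripped and frame.motion_rms > 0.2:
        return f"mobile grip ({gripped[0]} hand)"

    # Near-flat, motionless tablet suggests stationary tabletop use.
    if frame.tilt_deg < 15 and frame.motion_rms < 0.05:
        return "stationary on desk"
    return "unknown"
```

A posture-aware UI layer could poll `infer_posture` each frame and, for example, relocate pen tool palettes away from a planted palm or toward the gripping thumb.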
