Sensing techniques for tablet+stylus interaction

We explore grip and motion sensing to afford new techniques that leverage how users naturally manipulate tablet and stylus devices during pen + touch interaction. We can detect whether the user holds the pen in a writing grip or tucked between their fingers. We can distinguish bare-handed inputs, such as drag and pinch gestures produced by the non-preferred hand, from touch gestures produced by the hand holding the pen, which necessarily impart a detectable motion signal to the stylus. We can sense which hand grips the tablet, and determine the screen's orientation relative to the pen. By selectively combining these signals and using them to complement one another, we can tailor interaction to the context, such as by ignoring unintentional touch inputs while writing, or by supporting contextually appropriate tools, such as a magnifier for detailed stroke work that appears when the user pinches with the pen tucked between their fingers. These and other techniques can impart new, previously unanticipated subtleties to pen + touch interaction on tablets.
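One signal the abstract highlights is that a touch made by the hand holding the pen necessarily jostles the stylus, while a touch from the bare (non-preferred) hand does not. A minimal sketch of that idea, assuming a hypothetical stream of stylus accelerometer samples and illustrative window and threshold values (none of these names or constants come from the paper itself):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StylusSample:
    t: float          # timestamp, seconds
    accel_mag: float  # gravity-subtracted acceleration magnitude, m/s^2


def touch_from_pen_hand(touch_t: float,
                        stylus_samples: List[StylusSample],
                        window: float = 0.15,
                        threshold: float = 0.5) -> bool:
    """Classify a touch as coming from the pen-holding hand if the stylus
    shows a correlated motion 'bump' within a short window around the
    touch-down time. Window and threshold are illustrative assumptions."""
    nearby = [s.accel_mag for s in stylus_samples
              if abs(s.t - touch_t) <= window]
    if not nearby:
        return False  # no stylus motion data near the touch: treat as bare-handed
    return max(nearby) >= threshold


# A touch at t=0.1 coincides with a stylus motion spike -> pen-hand touch;
# a touch at t=0.6, with the stylus at rest, reads as a bare-handed gesture.
samples = [StylusSample(0.0, 0.05), StylusSample(0.1, 0.9), StylusSample(0.2, 0.1)]
print(touch_from_pen_hand(0.1, samples))  # True
print(touch_from_pen_hand(0.6, samples))  # False
```

In practice such a classifier would be one input among several (grip sensing, tablet orientation), combined to decide, for example, whether a pinch should summon the pen-tucked magnifier tool or be treated as an ordinary two-finger gesture.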
