Unimanual Pen+Touch Input Using Variations of Precision Grip Postures

We introduce a new pen input space by forming postures with the same hand that also grips the pen while writing, drawing, or selecting. The postures contact the multitouch surface around the pen, enabling detection without special sensors. A formative study investigates the effectiveness, accuracy, and comfort of 33 candidate postures in controlled tasks, and the results identify a useful subset of postures. Using raw capacitive sensor data captured in the study, a convolutional neural network is trained to recognize 10 postures in real time. This recognizer is used to create application demonstrations for pen-based document annotation and vector drawing. A small usability study shows that the approach is feasible.
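
To illustrate the kind of recognition pipeline the abstract describes, the sketch below trains a small convolutional classifier on low-resolution capacitive frames cropped around the pen tip. It is a minimal sketch only: the framework (PyTorch), the single-channel 32x32 crop size, the layer sizes, the `PostureCNN` name, and the training hyperparameters are illustrative assumptions, not the architecture reported in the paper.

```python
# Minimal sketch of a capacitive-frame posture classifier.
# Assumptions: PyTorch, 32x32 single-channel crops around the pen tip,
# 10 posture classes; all sizes and names here are illustrative.
import torch
import torch.nn as nn


class PostureCNN(nn.Module):
    """Small CNN mapping a raw capacitive crop to one of 10 posture classes."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 16x16 -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(p=0.5),                     # regularize the small dataset
            nn.Linear(32 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


def train_step(model, frames, labels, optimizer, loss_fn):
    """Run one optimization step over a batch of (frames, labels)."""
    optimizer.zero_grad()
    loss = loss_fn(model(frames), labels)
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    model = PostureCNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    # Placeholder batch standing in for recorded capacitive frames and labels.
    frames = torch.rand(8, 1, 32, 32)
    labels = torch.randint(0, 10, (8,))
    print(train_step(model, frames, labels, optimizer, loss_fn))
```

At inference time, the same forward pass would be applied to each incoming capacitive frame so that the predicted posture can switch tools or modes in the annotation and drawing applications.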
