Hand Contact Shape Recognition for Posture-Based Tabletop Widgets and Interaction

Tabletop interaction can be enriched by treating whole-hand contacts as input rather than fingertips alone. We describe a generalised, reproducible computer-vision algorithm for recognising hand contact shapes, with support for arm rejection and for dynamic properties such as finger movement and hover. A controlled experiment shows that the algorithm detects seven distinct contact shapes with roughly 91% average accuracy; we also explore the effects of long sleeves and of non-user-specific templates. The algorithm is used to trigger, parameterise, and dynamically control menu and tool widgets, and the usability of a subset of these widgets is qualitatively evaluated in a realistic application. Based on our findings, we formulate design recommendations for hand-shape-based interaction.
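The abstract does not spell out the recognition pipeline, but a common approach to contact-shape classification on touch surfaces is nearest-template matching on Hu moment invariants of the binary contact mask, which are translation-, scale-, and rotation-invariant. The sketch below is an illustration of that general technique, not the paper's actual algorithm; the shape labels ("palm", "edge") and the log-scaled feature distance are hypothetical choices for the example.

```python
import numpy as np

def hu_moments(mask):
    """Seven Hu moment invariants of a binary contact mask.

    Computed from normalized central moments, so the descriptor is
    invariant to translation, scale, and rotation of the contact blob.
    """
    ys, xs = np.nonzero(mask)
    m00 = float(len(xs))                      # blob area in pixels
    x, y = xs - xs.mean(), ys - ys.mean()     # centered coordinates

    def eta(p, q):
        # normalized central moment eta_pq = mu_pq / mu_00^(1 + (p+q)/2)
        return np.sum(x**p * y**q) / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    return np.array([
        n20 + n02,
        (n20 - n02)**2 + 4 * n11**2,
        (n30 - 3*n12)**2 + (3*n21 - n03)**2,
        (n30 + n12)**2 + (n21 + n03)**2,
        (n30 - 3*n12)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
            + (3*n21 - n03)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2),
        (n20 - n02)*((n30 + n12)**2 - (n21 + n03)**2)
            + 4*n11*(n30 + n12)*(n21 + n03),
        (3*n21 - n03)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
            - (n30 - 3*n12)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2),
    ])

def shape_features(mask):
    """Log-scale the invariants so components are comparable in magnitude."""
    h = hu_moments(mask)
    return np.sign(h) * np.log10(np.abs(h) + 1e-30)

def classify(mask, templates):
    """Return the template label nearest to the mask in feature space."""
    f = shape_features(mask)
    return min(templates, key=lambda k: np.linalg.norm(f - templates[k]))

# Hypothetical templates: a compact "palm" blob and an elongated "edge" blob.
square = np.zeros((40, 40)); square[5:35, 5:35] = 1
bar = np.zeros((40, 40)); bar[18:22, 2:38] = 1
templates = {"palm": shape_features(square), "edge": shape_features(bar)}
```

Because the features are scale- and rotation-invariant, a contact of the same shape class at a different size or orientation lands near the same template, which is what makes non-user-specific templates plausible in the first place.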
