An Interaction Model for Grasp-Aware Tangibles on Interactive Surfaces

Tangibles on interactive surfaces let users manipulate digital content physically by placing, moving, or removing a tangible object. However, whether and how a user grasps such an object has so far not been captured by interaction models for tangibles on interactive surfaces. Based on Buxton's Three-State Model for graphical input, we present an interaction model that describes input on tangibles that are aware of the user's grasp. Two examples show how users benefit from this extended model, and we further demonstrate how interaction with other existing tangibles for interactive tabletops can be described with it.
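As a rough illustration of the idea, the extension of Buxton's Three-State Model to grasp-aware tangibles can be sketched as a small state machine. This is a minimal sketch under assumptions of our own: the state names, event names, and transition table below are hypothetical and not taken from the paper.

```python
from enum import Enum

class TangibleState(Enum):
    # Illustrative states, loosely following Buxton's Three-State Model,
    # extended with a grasp-aware state (names are assumptions):
    OUT_OF_RANGE = 0   # tangible is not on the surface
    ON_SURFACE = 1     # tangible is placed on the surface, not grasped
    GRASPED = 2        # tangible is on the surface and held by the user

# Hypothetical (state, event) -> next-state transition table
TRANSITIONS = {
    (TangibleState.OUT_OF_RANGE, "place"):   TangibleState.ON_SURFACE,
    (TangibleState.ON_SURFACE,   "grasp"):   TangibleState.GRASPED,
    (TangibleState.GRASPED,      "release"): TangibleState.ON_SURFACE,
    (TangibleState.ON_SURFACE,   "lift"):    TangibleState.OUT_OF_RANGE,
}

def step(state: TangibleState, event: str) -> TangibleState:
    """Return the next state; events invalid in `state` leave it unchanged."""
    return TRANSITIONS.get((state, event), state)
```

The grasp/release transitions are what a classic placed-or-removed tangible model cannot express: a system tracking only surface contact would collapse ON_SURFACE and GRASPED into one state.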

[1] Michael Haller et al. Geckos: combining magnets and pressure images to enable new tangible-object design and interaction. CHI, 2011.

[2] Jon Trinder et al. The Humane Interface: New Directions for Designing Interactive Systems. Interact. Learn. Environ., 2002.

[3] Ivan Poupyrev et al. PAPILLON: designing curved display surfaces with printed optics. UIST, 2013.

[4] Ivan Poupyrev et al. Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects. CHI, 2012.

[5] Li-Wei Chan et al. TUIC: enabling tangible interaction on capacitive multi-touch displays. CHI, 2011.

[6] Andreas Butz et al. Optical pressure sensing for tangible user interfaces. ITS '11, 2011.

[7] Raphael Wimmer et al. FlyEye: grasp-sensitive surfaces using optical fiber. TEI '10, 2010.

[8] Patrick Baudisch et al. Precise selection techniques for multi-touch screens. CHI, 2006.

[9] Michael Rohs et al. CapWidgets: tangible widgets versus multi-touch controls on mobile devices. CHI Extended Abstracts, 2011.

[10] Yuichi Itoh et al. PUCs: detecting transparent, passive untouched capacitive widgets on unmodified multi-touch displays. ITS, 2013.

[11] Hiroshi Ishii et al. A comparison of spatial organization strategies in graphical and tangible user interfaces. DARE '00, 2000.

[12] Jun Rekimoto et al. SmartSkin: an infrastructure for freehand manipulation on interactive surfaces. CHI, 2002.

[13] William Buxton et al. A three-state model of graphical input. INTERACT, 1990.

[14] Sebastian Boring et al. HandSense: discriminating different ways of grasping and holding a tangible user interface. Tangible and Embedded Interaction, 2009.

[15] Andreas Butz et al. HapTouch and the 2+1 state model: potentials of haptic feedback on touch based in-vehicle information systems. AutomotiveUI, 2010.

[16] Jan O. Borchers et al. Madgets: actuating widgets on interactive tabletops. UIST, 2010.

[17] James D. Hollan et al. SLAP widgets: bridging the gap between virtual and physical controls on tabletops. CHI, 2009.

[18] Hiroshi Ishii et al. Bricks: laying the foundations for graspable user interfaces. CHI '95, 1995.

[19] Stefanie Müller et al. CapStones and ZebraWidgets: sensing stacks of building blocks, dials and sliders on capacitive touch screens. CHI, 2012.