Tickle: a surface-independent interaction technique for grasp interfaces

We present a wearable interface consisting of motion sensors. Because the interface can be worn on the user's fingers (as a ring) or fixed to them (with nail polish), the device controlled by finger gestures can be any generic object, provided it offers an interface for receiving the sensor's signal. We implemented four gestures: tap, release, swipe, and pitch, all of which can be executed with a finger of the hand holding the device. In a user study we tested gesture appropriateness for the index finger on the back of a handheld tablet that offered three different form factors on its rear: flat, convex, and concave (undercut). Gesture performance was comparable across all three shapes; however, pitch performed better than swipe on every surface. The proposed interface is a step toward ubiquitous computing and the vision of seamless interaction with grasped objects. As an initial application scenario, we implemented a camera control that lets the user adjust brightness with the tested gestures on a common SLR device.
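To make the sensing idea concrete, the following is a minimal, hypothetical sketch of how a tap gesture could be spotted in a stream of finger-worn accelerometer samples using a simple magnitude-threshold peak detector. The function name, thresholds, and refractory window are illustrative assumptions; the abstract does not specify the actual recognition pipeline.

```python
import math

def detect_taps(samples, threshold=2.5, refractory=10):
    """Return sample indices where the acceleration magnitude spikes.

    samples:    list of (ax, ay, az) tuples in units of g
                (illustrative input format, not the paper's).
    threshold:  magnitude above which a spike counts as a tap.
    refractory: minimum number of samples between two detected taps,
                so one physical tap is not reported twice.
    """
    taps = []
    last = -refractory
    for i, (ax, ay, az) in enumerate(samples):
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and i - last >= refractory:
            taps.append(i)
            last = i
    return taps
```

A real implementation would additionally need per-user calibration and a way to distinguish taps from swipes and pitches, e.g. by inspecting the dominant axis and duration of the spike.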
