Magic Finger: always-available input through finger instrumentation

We present Magic Finger, a small device worn on the fingertip that supports always-available input. Magic Finger inverts the typical relationship between the finger and an interactive surface: rather than instrumenting the surface being touched, we instrument the user's finger itself. Magic Finger senses touch through an optical mouse sensor, enabling any surface to act as a touch screen, and it senses texture through a micro RGB camera, allowing contextual actions to be carried out based on the particular surface being touched. A technical evaluation shows that Magic Finger can distinguish 22 textures with 98.9% accuracy. We explore the interaction design space enabled by Magic Finger and implement a number of novel interaction techniques that leverage its unique capabilities.
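The abstract does not detail how camera images are turned into one of the 22 texture labels. A minimal sketch of one standard approach to this kind of problem — local binary pattern (LBP) histograms matched to per-texture centroids by chi-squared distance — is shown below. The function names (`lbp_histogram`, `nearest_texture`) and the nearest-centroid matcher are illustrative assumptions, not the paper's actual pipeline, which may use a trained classifier instead.

```python
import numpy as np

def lbp_histogram(img):
    """Return a normalized 256-bin local binary pattern histogram for a
    2-D grayscale image -- a common, lighting-robust texture descriptor."""
    c = img[1:-1, 1:-1]                                # center pixels
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]       # the 8 neighbors
    codes = np.zeros(c.shape, dtype=int)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy,
                 1 + dx:img.shape[1] - 1 + dx]
        codes |= (nb >= c).astype(int) << bit          # set bit if neighbor >= center
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def nearest_texture(sample_hist, centroids):
    """Label a sample by chi-squared distance to per-texture centroid
    histograms (a simple stand-in for a trained classifier)."""
    def chi2(p, q):
        d = p + q
        m = d > 0
        return float(np.sum((p[m] - q[m]) ** 2 / d[m]))
    return min(centroids, key=lambda name: chi2(sample_hist, centroids[name]))
```

In use, one centroid histogram would be computed per known surface (desk, jeans, palm, ...) from training images, and each new camera frame would be labeled with the nearest centroid; a contextual action could then be dispatched based on that label.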
