Mime: compact, low-power 3D gesture sensing for interaction with head-mounted displays

We present Mime, a compact, low-power 3D sensor for unencumbered, free-form, single-handed gestural interaction with head-mounted displays (HMDs). Mime introduces a real-time signal-processing framework that combines a novel three-pixel time-of-flight (TOF) module with a standard RGB camera. The TOF module achieves accurate 3D hand localization and tracking, thereby enabling motion-controlled gestures. Joint processing of the 3D information with the RGB image data enables finer, shape-based gestural interaction. Our Mime hardware prototype achieves fast and precise 3D gestural control. Compared with state-of-the-art 3D sensors such as TOF cameras, the Microsoft Kinect, and the Leap Motion Controller, Mime offers several key advantages for mobile applications and HMD use cases: very small size, daylight insensitivity, and low power consumption. Mime is built from standard, low-cost optoelectronic components and promises to be an inexpensive technology that can either serve as a peripheral component or be embedded within the HMD unit. We demonstrate the utility of the Mime sensor for HMD interaction across a variety of application scenarios, including 3D spatial input using close-range gestures, gaming, on-the-move interaction, and operation in cluttered environments and in broad daylight.
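The geometry behind three-pixel 3D localization can be made concrete with a small sketch. Assuming each TOF pixel's measurement has already been reduced to a range estimate r = c·t/2 from that pixel to the hand (an assumption for illustration; the paper's real-time framework operates on the raw TOF signals and is more elaborate), three ranges from non-collinear pixels pin down the hand's 3D position by trilateration. The NumPy sketch below is illustrative only, not the authors' implementation; the sensor positions and function names are hypothetical.

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Closed-form trilateration: find the 3D point at distances
    r1, r2, r3 from known sensor positions p1, p2, p3.
    Of the two mirror solutions, return the one in front of the
    sensor plane (positive z in the local frame), which is the
    physically meaningful one for a forward-facing HMD sensor."""
    # Build a local orthonormal frame with p1 at the origin and
    # p2 along the local x-axis.
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = np.dot(ex, p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = np.dot(ey, p3 - p1)

    # Intersect the three range spheres in the local frame.
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z_sq = r1**2 - x**2 - y**2
    if z_sq < 0:
        return None  # noisy/inconsistent ranges: spheres do not intersect
    z = np.sqrt(z_sq)
    return p1 + x * ex + y * ey + z * ez

# Example: three pixels on a few-cm baseline at the HMD brow (meters).
sensors = [np.array(p) for p in ([0.0, 0.0, 0.0],
                                 [0.04, 0.0, 0.0],
                                 [0.02, 0.03, 0.0])]
hand = np.array([0.10, 0.05, 0.25])                   # ground-truth hand position
ranges = [np.linalg.norm(hand - p) for p in sensors]  # ideal, noise-free ranges
print(trilaterate(*sensors, *ranges))                 # ~ [0.10, 0.05, 0.25]
```

With a baseline of only a few centimeters, range noise is strongly amplified in the recovered depth, which is one reason a compact sensor of this kind benefits from temporal filtering of the raw TOF signals rather than per-frame closed-form geometry.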
