CyclopsRing: Enabling Whole-Hand and Context-Aware Interactions Through a Fisheye Ring

This paper presents CyclopsRing, a ring-style fisheye imaging wearable device worn on the hand webbing to enable whole-hand and context-aware interactions. Observing from a central position on the hand through a fisheye perspective, CyclopsRing sees not only the operating hand but also the environmental contexts involved in hand-based interactions. Because CyclopsRing is a finger-worn device, it also allows users to fully preserve the skin feedback of their hands. This paper demonstrates a proof-of-concept device, reports its hand-gesture recognition performance using a random decision forest (RDF) method, and, built upon the gesture recognizer, presents a set of interaction techniques including on-finger pinch-and-slide input, in-air pinch-and-motion input, palm-writing input, and their interactions with the environmental contexts. The experiment obtained an 84.75% recognition rate of hand-gesture input on a database of seven hand gestures collected from 15 participants. To our knowledge, CyclopsRing is the first ring-wearable device that supports whole-hand and context-aware interactions.
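The random-decision-forest recognizer described above can be sketched in a few lines. This is an illustrative stand-in, not the authors' implementation: the feature extraction from fisheye camera frames is replaced by synthetic clustered vectors, and the feature dimension and forest size are hypothetical; only the seven-gesture class count follows the paper.

```python
# Hedged sketch of an RDF gesture classifier in the spirit of CyclopsRing.
# Assumptions (not from the paper): 16-dim features, 30 trees, synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
N_GESTURES = 7            # seven hand gestures, as in the paper's database
SAMPLES_PER_GESTURE = 40  # hypothetical sample count per gesture
FEATURE_DIM = 16          # hypothetical feature length per camera frame

# Synthetic stand-in for features extracted from fisheye frames:
# each gesture class clusters around its own random centroid.
centroids = rng.normal(size=(N_GESTURES, FEATURE_DIM))
X = np.vstack([c + 0.3 * rng.normal(size=(SAMPLES_PER_GESTURE, FEATURE_DIM))
               for c in centroids])
y = np.repeat(np.arange(N_GESTURES), SAMPLES_PER_GESTURE)

# Fit the forest and score it on the same synthetic set.
clf = RandomForestClassifier(n_estimators=30, random_state=0)
clf.fit(X, y)
accuracy = clf.score(X, y)
predicted = clf.predict(X[:1])  # class label for the first sample
```

In a real pipeline the rows of `X` would be per-frame features (e.g. segmented hand-region pixels) and `y` the gesture labels collected from participants; the paper's reported 84.75% rate comes from its own cross-participant evaluation, not from a setup like this one.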
