Cyclops: Wearable and Single-Piece Full-Body Gesture Input Devices

This paper presents Cyclops, a single-piece wearable device that captures the wearer's whole-body postures from an ego-centric view: a fisheye lens worn at the center of the body observes only the user's limbs, which is sufficient to interpret body postures effectively. Unlike existing body-gesture input systems that rely on external cameras or on motion sensors distributed across the body, Cyclops is a single piece worn as a pendant or a badge. The key idea is observing the limbs from a central location on the body; owing to this ego-centric view, Cyclops turns posture recognition into a highly controllable computer-vision problem. The paper demonstrates a proof-of-concept device and an algorithm that recognizes static and moving body gestures using motion history images (MHI) and a random decision forest (RDF). Four example applications are presented: interactive bodily workout, a mobile racing game involving hands and feet, a full-body virtual reality system, and interaction with a tangible toy. In the bodily-workout experiment, on a database of 20 workout gestures collected from 20 participants, Cyclops achieved a recognition rate of 79% using MHI with simple template matching, which increased to 92% with the more advanced machine-learning approach of RDF.
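To make the recognition pipeline concrete, the following is a minimal sketch of MHI-based gesture matching in the spirit of the template-matching baseline described above. It is an illustrative re-implementation of the standard motion-history-image idea (Bobick & Davis), not the authors' Cyclops code; the frame size, motion threshold, and decay duration are assumed values.

```python
# Minimal MHI + nearest-neighbour template matching sketch (illustrative only).
import numpy as np

MHI_DURATION = 30      # frames a motion pixel persists before decaying (assumed)
DIFF_THRESHOLD = 32    # frame-difference threshold marking "motion" (assumed)

def update_mhi(mhi, prev_frame, frame, timestamp):
    """Stamp pixels where the new frame differs from the old, and forget stale motion."""
    motion = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16)) > DIFF_THRESHOLD
    mhi = np.where(motion, float(timestamp), mhi)
    mhi[mhi < timestamp - MHI_DURATION] = 0.0
    return mhi

def mhi_descriptor(frames):
    """Fold a gesture clip (list of grayscale arrays) into one normalized MHI vector."""
    mhi = np.zeros(frames[0].shape, dtype=np.float32)
    for t in range(1, len(frames)):
        mhi = update_mhi(mhi, frames[t - 1], frames[t], timestamp=t)
    peak = mhi.max()
    return (mhi / peak).ravel() if peak > 0 else mhi.ravel()

def match_gesture(query_frames, templates):
    """Return the label of the template MHI closest to the query clip's MHI."""
    q = mhi_descriptor(query_frames)
    return min(templates, key=lambda label: np.linalg.norm(q - templates[label]))

if __name__ == "__main__":
    # Smoke test on synthetic 64x64 grayscale clips (hypothetical gesture labels).
    rng = np.random.default_rng(0)
    clip = [rng.integers(0, 255, (64, 64), dtype=np.uint8) for _ in range(10)]
    templates = {"squat": mhi_descriptor(clip), "lunge": mhi_descriptor(clip[::-1])}
    print(match_gesture(clip, templates))
```

In this sketch the per-gesture MHI acts as a fixed-length descriptor; the paper's reported improvement from 79% to 92% comes from replacing the nearest-neighbour matching step with a random decision forest trained on such descriptors.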
