ShoeSoleSense: proof of concept for a wearable foot interface for virtual and real environments

ShoeSoleSense is a proof of concept for a novel body-worn interface: an insole that enables location-independent, hands-free interaction through the feet. Forgoing hand or finger interaction is especially beneficial when the user is engaged in real-world tasks. In virtual environments, such as safety training applications, movement is often controlled via finger input, which is not well suited to the task. To enable more intuitive interaction, alternative control concepts rely on gesture control, which is usually tracked by statically installed cameras in CAVE-like installations; since tracking coverage is limited, problems may occur there as well. The introduced prototype provides a novel control concept for virtual reality as well as real-life applications. Demonstrated functions include movement control in a virtual reality installation, such as moving straight, turning, and jumping. Furthermore, the prototype provides additional feedback by heating the feet and vibrating in dedicated areas on the surface of the insole.
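One plausible way to turn such insole sensor readings into the described movement commands is a simple threshold classifier over foot-pressure zones. The sketch below is illustrative only: the toe/heel sensor layout, the threshold value, and the command names are assumptions, not the mapping actually used in the ShoeSoleSense prototype.

```python
def classify_step(toe: float, heel: float, threshold: float = 0.5) -> str:
    """Map normalized toe/heel pressure (0..1) of one insole to a command.

    Hypothetical mapping for illustration: full-foot pressure means walking
    straight, toe-only pressure triggers a jump, heel-only pressure a turn.
    """
    toe_on = toe > threshold
    heel_on = heel > threshold
    if toe_on and heel_on:
        return "forward"   # full-foot pressure: move straight ahead
    if toe_on:
        return "jump"      # weight shifted onto the toes only
    if heel_on:
        return "turn"      # weight shifted onto the heel only
    return "idle"          # no significant pressure on the insole


if __name__ == "__main__":
    # Example readings from a single (hypothetical) insole:
    print(classify_step(0.9, 0.8))  # forward
    print(classify_step(0.9, 0.1))  # jump
    print(classify_step(0.1, 0.9))  # turn
```

A real implementation would additionally need debouncing and per-user calibration, since resting pressure varies with stance and body weight.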
