Virtual exertions: A user interface combining visual information, kinesthetics and biofeedback for virtual object manipulation

Virtual reality environments can present users with rich visual representations of simulated spaces. However, the means available for interacting with these illusions are generally unnatural, in that they do not match how humans grasp and move objects in the physical world. We demonstrate a system that lets users interact with virtual objects through natural body movements by combining visual information, kinesthetics, and biofeedback from electromyograms (EMG). Our method allows virtual objects to be grasped, moved, and dropped by classifying muscle exertion levels calibrated against physical-world masses. We show that users can consistently reproduce these calibrated exertions, allowing them to interface with objects in a novel way.
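
The grasp/move/drop behavior described above can be pictured as a thresholded classifier over a smoothed EMG signal. The sketch below is a minimal, hypothetical illustration rather than the authors' implementation: it assumes a rectified, smoothed EMG envelope as input, and the names ExertionClassifier, grasp_level, and release_level are invented for this example. The two thresholds stand in for per-user calibration against the exertion needed to hold a reference physical mass, with a hysteresis gap to avoid flickering between grasped and dropped states.

```python
from dataclasses import dataclass


@dataclass
class ExertionClassifier:
    """Hypothetical threshold-based grasp classifier for a smoothed EMG envelope.

    grasp_level and release_level would be calibrated per user against the
    exertion required to hold a known physical mass; keeping release_level
    below grasp_level adds hysteresis so the grasp state does not flicker
    near the boundary.
    """
    grasp_level: float      # envelope value treated as "exerting enough to grasp"
    release_level: float    # lower value below which the object is dropped
    holding: bool = False   # current grasp state

    def update(self, emg_envelope: float) -> bool:
        """Return True while the virtual object should remain grasped."""
        if not self.holding and emg_envelope >= self.grasp_level:
            self.holding = True       # exertion reached the calibrated grasp level
        elif self.holding and emg_envelope < self.release_level:
            self.holding = False      # exertion relaxed: drop the object
        return self.holding


# Example: feed a stream of envelope samples (arbitrary units) through the
# classifier; the virtual object would be attached to the hand only while
# update() returns True.
classifier = ExertionClassifier(grasp_level=0.6, release_level=0.3)
for sample in [0.1, 0.4, 0.7, 0.65, 0.5, 0.35, 0.2]:
    print(f"envelope={sample:.2f} grasped={classifier.update(sample)}")
```

In practice such a classifier would sit between the EMG acquisition stage and the rendering loop, with its thresholds set during the calibration phase described in the abstract.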
