Comparing Real and Virtual Object Manipulation by Physiological Signals Analysis: A First Study

Virtual reality aims to reproduce reality and to simulate actions such as object manipulation tasks. Despite abundant research on designing 3D interaction devices and methods that achieve close-to-real manipulation in virtual environments, strong differences remain between real and virtual object manipulation. Past work comparing the two has focused mainly on user performance. In this paper, we propose additionally using physiological signals, namely electromyography (EMG), to better characterize these differences. A first experiment, featuring a simple pick-and-place task performed both on a real setup and in a CAVE system, showed that participants’ muscular activity exhibits a clearly different spectrum in the virtual environment than in reality.
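The kind of EMG spectral comparison mentioned above is commonly based on the median frequency of the signal's power spectrum, a standard descriptor in muscle-fatigue analysis. The sketch below is illustrative only, not the paper's actual pipeline: the function name, sampling rate, and synthetic test signal are assumptions.

```python
import numpy as np
from scipy.signal import welch

def median_frequency(emg, fs):
    """Median frequency of an EMG recording: the frequency that splits
    the Welch power spectrum into two halves of equal energy. A downward
    shift in this value is a classic marker of muscle fatigue.
    (Illustrative helper; not the authors' actual analysis code.)"""
    freqs, psd = welch(emg, fs=fs, nperseg=min(1024, len(emg)))
    cumulative = np.cumsum(psd)
    idx = np.searchsorted(cumulative, cumulative[-1] / 2.0)
    return freqs[idx]

# Synthetic stand-in for an EMG trace: a 60 Hz component in noise,
# sampled at an assumed 1 kHz.
fs = 1000
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
emg_like = np.sin(2 * np.pi * 60 * t) + 0.1 * rng.standard_normal(t.size)
mf = median_frequency(emg_like, fs)
```

Comparing this statistic between the real and the virtual condition would quantify the spectral difference the abstract reports.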
