A Physiological Evaluation of Immersive Experience of a View Control Method using Eyelid EMG

This paper shows that the number of blood-volume pulses (BVP) and the level of skin conductance (SC) increased more, reflecting a stronger immersive impression, with a view control method based on eyelid electromyography in a virtual environment (VE) than with a mouse control method. We have developed a view control method in which visual feedback is driven by electromyography (EMG) signals from the user's eyelid movements. Because of the strong coupling between eyelid movement and visual feedback, the method gives the user a more immersive experience in the virtual environment. This paper reports a physiological evaluation experiment comparing the method with a common mouse input method by measuring subjects' physiological responses to fear of an open high place in a virtual environment. Based on the results, we find that the eyelid-movement input method improves the user's immersive impression significantly more than the mouse input method.
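
As an illustration of how such an interface might couple eyelid EMG to visual feedback, the following Python sketch rectifies and smooths a synthetic eyelid-EMG signal and maps the resulting envelope to a 0-1 "closure" value that a renderer could use to darken or narrow the view. This is a minimal sketch under stated assumptions, not the authors' implementation; the sampling rate, smoothing window, and threshold are hypothetical values chosen only for illustration.

```python
# Illustrative sketch only: map a rectified, smoothed eyelid-EMG envelope to a
# view "closure" parameter for visual feedback. All constants are hypothetical.

import numpy as np

FS = 1000            # assumed EMG sampling rate in Hz (hypothetical)
WINDOW_MS = 50       # smoothing window for the EMG envelope (hypothetical)
ON_THRESHOLD = 0.15  # normalized envelope level treated as eyelid activity (hypothetical)

def emg_envelope(emg: np.ndarray) -> np.ndarray:
    """Full-wave rectify the EMG and smooth it with a moving average."""
    rectified = np.abs(emg - np.mean(emg))        # remove DC offset, rectify
    win = max(1, int(FS * WINDOW_MS / 1000))
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def view_closure(envelope: np.ndarray) -> np.ndarray:
    """Map the normalized envelope to a 0..1 'closure' value for visual feedback."""
    norm = envelope / (np.max(envelope) + 1e-9)
    return np.clip((norm - ON_THRESHOLD) / (1.0 - ON_THRESHOLD), 0.0, 1.0)

if __name__ == "__main__":
    # Synthetic signal standing in for eyelid EMG: a burst of activity mid-recording.
    t = np.arange(0, 2.0, 1.0 / FS)
    emg = 0.02 * np.random.randn(t.size)
    emg[800:1200] += 0.5 * np.random.randn(400)   # simulated eyelid-movement burst
    closure = view_closure(emg_envelope(emg))
    print(f"peak closure value: {closure.max():.2f}")
```

In such a scheme, the closure value could, for example, modulate the rendered field of view so that the visual feedback follows the user's eyelid movement directly; the actual mapping used in the paper's system is not detailed here.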
