A Study of Contactless Human Computer Interaction with Virtual Environments

Applications of virtual environments are currently a major growth area. The different ways of interacting with these environments allow us to perform a variety of tasks within them. However, new users regularly face problems when using some of the devices associated with these environments. This paper studies and analyzes the contactless interaction of new users with a virtual environment. In particular, the study analyzes users' frustration levels while they learn and adapt. Frustration levels are acquired from an EEG reader and used to evaluate users' performance and engagement in a simple task of picking up and arranging objects. Two modes of interaction were tested: in one, users interact with the environment using their hands; in the other, they interact using a physical tool. Both modes of contactless interaction rely on the Leap Motion Controller. Preliminary results show a relation between the dispersion of frustration levels and adaptation: the data evolve from being highly dispersed toward normal distributions as the user becomes more adapted.
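The dispersion effect described above can be sketched numerically. The following is a minimal, hypothetical illustration (the session data and 0-1 frustration scale are invented for the example, not taken from the study): per-session frustration readings are summarized by their standard deviation, which shrinks as the readings concentrate around a central value in later sessions.

```python
# Hypothetical sketch of the dispersion analysis described in the abstract.
# The readings below are illustrative; the actual study derives frustration
# levels from an EEG reader.
from statistics import pstdev

def dispersion(samples):
    """Population standard deviation of one session's frustration readings."""
    return pstdev(samples)

# Illustrative frustration readings (0-1 scale) for three sessions of one user:
early  = [0.9, 0.1, 0.8, 0.2, 0.7, 0.3]        # highly dispersed
middle = [0.6, 0.4, 0.7, 0.3, 0.5, 0.5]
late   = [0.5, 0.45, 0.55, 0.5, 0.48, 0.52]    # concentrated around the mean

spreads = [round(dispersion(s), 3) for s in (early, middle, late)]
print(spreads)  # → [0.311, 0.129, 0.031]: dispersion shrinks with adaptation
```

A fuller analysis would additionally test each session's readings for normality (e.g. with a Shapiro-Wilk test) rather than relying on dispersion alone.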
