The Impact of Multi-sensory Stimuli on Confidence Levels for Perceptual-cognitive Tasks in VR

Supporting perceptual-cognitive tasks is an important part of our daily lives. We rely on rich, multi-sensory feedback through sight, sound, touch, smell, and taste to perform everyday perceptual-cognitive tasks, such as playing sports, cooking, and searching for a location, and this feedback increases our confidence in performing them. As in real life, the demand for perceptual-cognitive tasks also exists in serious VR simulations such as surgical or safety training systems. In contrast to real life, however, VR simulations are typically limited to visual and auditory cues, sometimes supplemented with simple tactile feedback. This limitation could make it difficult to make confident decisions in VR. In this paper, we investigate the effects of multi-sensory stimuli, namely visuals, audio, two types of tactile feedback (floor vibration and wind), and smell, on confidence levels in a location-matching task that requires a combination of perceptual and cognitive work inside a virtual environment. We also measured participants' sense of presence when they visited virtual places with different combinations of sensory feedback. Our results show that our multi-sensory VR system was superior to a typical VR system (vision and audio only) in terms of the sense of presence and user preference; however, subjective confidence levels were higher in the typical VR system.
