A Portable Device for Five Sense Augmented Reality Experiences in Museums

Augmented reality (AR) currently focuses on two senses, sight and hearing. The work presented here is part of the Mobile Five Senses Augmented Reality system for Museums (M5SAR) project, whose goal is to develop an AR system that serves as a guide in cultural and historical events and museums, complementing or replacing traditional guides, directional signs, or maps, while enhancing the user's experience by adding multisensorial information to multiple museum objects. Existing solutions of this kind either lack portability or fail to cover all the human senses at the same time. This paper presents a new device capable of extending augmented reality experiences to all five human senses, through a portable device that can reproduce stimuli of touch, taste, and smell. The proposed apparatus is meant to be paired with a mobile application that controls which sensorial interface is activated and when, relaying that information to the portable device. The application, running on the user's smartphone or tablet, sends the activation instructions via Bluetooth using a communication protocol; these instructions are received by the device's core microcontroller, which then activates the requested physical interfaces to deliver the multisensorial media to the user.
