Analysis of information provision methods for indoor navigation using interactive floor projection

In recent years, indoor navigation has attracted attention, but it is usually difficult to apply existing outdoor navigation systems indoors due to the lack of an indoor positioning system. In addition, although navigation applications are popular outdoors, watching the display of a smartphone or tablet while walking may cause collision accidents in a crowded indoor environment. It is therefore also necessary to consider how to provide guidance information to users. In this study, we develop an interactive floor system that combines a laser range finder based tracking system with projectors and apply it to indoor navigation. The system projects markers on the floor according to the user's position and leads the user to the goal points. This paper focuses in particular on an analysis of the basic characteristics of human navigation using images projected on the floor. Two approaches to moving the markers toward the destination are presented. The effects of the design parameters of each approach when a person walks straight are evaluated through experiments.
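To make the marker-leading behavior concrete, the following is a minimal sketch of one plausible policy: projecting the guide marker a fixed lead distance ahead of the tracked user along the straight line toward the current goal point. The function name next_marker_position and the lead_distance parameter are hypothetical illustrations, not the paper's actual marker-control approaches, which are only evaluated experimentally in the paper.

```python
import math

# Hypothetical sketch (not the paper's method): project a guide marker a fixed
# lead distance ahead of the tracked user, along the line toward the goal.

def next_marker_position(user_xy, goal_xy, lead_distance=0.8):
    """Return floor coordinates (m) at which to project the guide marker."""
    ux, uy = user_xy
    gx, gy = goal_xy
    dx, dy = gx - ux, gy - uy
    dist_to_goal = math.hypot(dx, dy)
    if dist_to_goal <= lead_distance:
        # Close to the goal: project the marker directly on the goal point.
        return goal_xy
    scale = lead_distance / dist_to_goal
    return (ux + dx * scale, uy + dy * scale)

if __name__ == "__main__":
    # Example: user tracked by the laser range finder at (1.0, 2.0),
    # goal point at (5.0, 2.0); the marker lands 0.8 m ahead of the user.
    print(next_marker_position((1.0, 2.0), (5.0, 2.0)))  # -> (1.8, 2.0)
```

In a real system the same computation would be repeated at the tracking rate, so the marker appears to move ahead of the user as the user walks; how quickly and how far the marker moves corresponds to the design parameters the paper evaluates.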
