A UWB-Driven Self-Actuated Projector Platform for Interactive Augmented Reality Applications

With the rapid development of interactive technology, systems that let users freely define their interaction envelope and that offer multiple interaction modalities are key to building intuitive interactive spaces. We present an indoor interactive system in which a user can customize a projected screen on the surrounding surfaces and interact through it. An ultra-wideband (UWB) wireless sensor network supports the human-centered interaction design and navigates the self-actuated projector platform. We developed a UWB-based calibration algorithm to enable interaction with the customized projected screens, and we designed a hand-held input device to perform mid-air interactive functions. Sixteen participants were recruited to evaluate the system's performance. A prototype-level implementation was tested in a simulated museum environment, where the self-actuated projector provides interactive explanatory content for the on-display artifacts on the user's command. Our results show that users can designate interactive screens efficiently indoors and interact with the augmented content with reasonable accuracy and a relatively low workload. Our findings also provide valuable user-experience insights for the design of mobile, projection-based augmented reality systems, which can overcome limitations of other conventional techniques.

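The abstract does not detail how the UWB network localizes the platform or the hand-held device, but a common building block for such systems is least-squares multilateration from anchor ranges. The sketch below is a minimal illustration of that idea, not the authors' implementation; the anchor layout, tag position, and function names are hypothetical.

```python
# Minimal sketch (assumption, not the paper's algorithm): estimating a UWB
# tag's 2-D position from ranges to fixed anchors via least-squares
# multilateration, one common way an indoor UWB network localizes a mobile
# platform or a hand-held device.
import numpy as np


def trilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Solve for (x, y) from distances to three or more known anchors.

    Linearizes d_i^2 = |p - a_i|^2 by subtracting the equation of the last
    anchor, then solves the resulting linear system A p = b in a
    least-squares sense.
    """
    ref, d_ref = anchors[-1], ranges[-1]
    A = 2.0 * (ref - anchors[:-1])                       # shape (n-1, 2)
    b = (ranges[:-1] ** 2 - d_ref ** 2
         + np.sum(ref ** 2) - np.sum(anchors[:-1] ** 2, axis=1))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos


if __name__ == "__main__":
    # Four hypothetical UWB anchors at the corners of a 10 m x 8 m room.
    anchors = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 8.0], [0.0, 8.0]])
    true_pos = np.array([3.5, 2.0])
    ranges = np.linalg.norm(anchors - true_pos, axis=1)  # noise-free ranges
    print(trilaterate(anchors, ranges))                  # approx. [3.5, 2.0]
```

In practice, measured UWB ranges are noisy, so such an estimate would typically be filtered (e.g., with a Kalman filter) before being used to drive the projector platform or to calibrate the projected screen.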
[1]  Markus Funk,et al.  Survey of Interactive Displays through Mobile Projections , 2016, Int. J. Mob. Hum. Comput. Interact..

[2]  S. Hart,et al.  Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research , 1988 .

[3]  Alberto Jaspe Villanueva,et al.  IsoCam: Interactive Visual Exploration of Massive Cultural Heritage Models on Large Projection Setups , 2014, JOCCH.

[4]  Ross T. Smith,et al.  Use of projector based augmented reality to improve manual spot-welding precision and accuracy for automotive manufacturing , 2016, The International Journal of Advanced Manufacturing Technology.

[5]  H. Yamazoe,et al.  Projection mapping onto multiple objects using a projector robot , 2018 .

[6]  Carl T. Haas,et al.  Impact of augmented reality and spatial cognition on assembly in construction , 2019 .

[7]  Tack-Don Han,et al.  PPAP: Perspective Projection Augment Platform with Pan–Tilt Actuation for Improved Spatial Perception , 2019, Sensors.

[8]  Joseph A. Paradiso,et al.  Sensor systems for interactive surfaces , 2000, IBM Syst. J..

[9]  Christian Dindler,et al.  Staging imaginative places for participatory prototyping , 2008 .

[10]  Shuang-Hua Yang,et al.  A Survey of Indoor Positioning and Object Locating Systems , 2010 .

[11]  David Alejandro,et al.  Projection-Based Augmented Reality Assistance for Manual Electronic Component Assembly Processes , 2020, Applied Sciences.

[12]  Philip T. Kortum,et al.  Determining what individual SUS scores mean: adding an adjective rating scale , 2009 .

[13]  Li Li,et al.  The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research , 2017, i-Perception.

[14]  Itaru Kitahara,et al.  FUTUREGYM: A gymnasium with interactive floor projection for children with special needs , 2017, Int. J. Child Comput. Interact..

[15]  F. Dickmann,et al.  Augmented Reality (AR) and Spatial Cognition: Effects of Holographic Grids on Distance Estimation and Location Memory in a 3D Indoor Scenario , 2020, PFG – Journal of Photogrammetry, Remote Sensing and Geoinformation Science.