Situated Visualization in Augmented Reality: Exploring Information Seeking Strategies

In recent years, augmented reality applications have increasingly demonstrated the need to interact with information that is related to, and displayed directly within, the surrounding environment. Situated information is visualized in its semantic and spatial context, building an environment enhanced by an information layer that dynamically adapts both to how the information is produced and to the user's actions. Exploring and manipulating this type of data through see-through augmented reality devices remains a challenging task, so developing interaction strategies capable of mitigating the current limitations of these devices is essential. In this context, our contribution is the design of solutions that address some of these challenges and enable dynamic interaction with situated information. Following the visual "information-seeking mantra" proposed by Shneiderman and granting users a set of "superpowers", we present strategies for obtaining an overview of a collection of situated data, filtering it, and acquiring details on demand.
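
To make the three stages of the mantra concrete in this setting, the following minimal Python sketch (not the system described in the paper; the SituatedItem record, its fields, and the distance-based filter are illustrative assumptions) shows how a collection of data points anchored in world space could be summarized (overview), restricted to the user's surroundings or to a chosen category (zoom and filter), and expanded into a full record for a single selected item (details on demand).

```python
from dataclasses import dataclass
from math import dist

# Hypothetical record for one situated data item: a value anchored to a
# physical position (world coordinates) with a semantic category.
@dataclass
class SituatedItem:
    label: str
    category: str
    position: tuple  # (x, y, z) in metres, world space
    value: float

def overview(items):
    """Overview: aggregate the whole collection per category."""
    by_category = {}
    for item in items:
        by_category.setdefault(item.category, []).append(item.value)
    return {cat: sum(vals) / len(vals) for cat, vals in by_category.items()}

def filter_items(items, user_pos, radius=None, category=None):
    """Zoom and filter: keep items near the user and/or of one category."""
    kept = items
    if radius is not None:
        kept = [i for i in kept if dist(i.position, user_pos) <= radius]
    if category is not None:
        kept = [i for i in kept if i.category == category]
    return kept

def details(item):
    """Details on demand: full record for one selected item."""
    return f"{item.label} ({item.category}): {item.value} at {item.position}"

# Example flow: overview first, then filter to nearby items, then inspect one.
data = [
    SituatedItem("sensor-A", "temperature", (1.0, 0.0, 2.0), 21.5),
    SituatedItem("sensor-B", "temperature", (8.0, 0.0, 4.0), 19.0),
    SituatedItem("sensor-C", "humidity", (1.5, 0.0, 2.5), 48.0),
]
print(overview(data))
nearby = filter_items(data, user_pos=(0.0, 0.0, 0.0), radius=3.0)
print(details(nearby[0]))
```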

[1] Dieter Schmalstieg et al. Image-driven view management for augmented reality browsers. IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2012.

[2] Beatriz Sousa Santos et al. Situated Visualization in the Decision Process Through Augmented Reality. 23rd International Conference Information Visualisation (IV), 2019.

[3] Mihran Tuceryan et al. Automatic determination of text readability over textured backgrounds for augmented reality systems. Third IEEE and ACM International Symposium on Mixed and Augmented Reality, 2004.

[4] Joseph S. Dumas et al. Comparison of three one-question, post-task usability questionnaires. CHI, 2009.

[5] Steven K. Feiner et al. View management for virtual and augmented reality. UIST '01, 2001.

[6] Ross T. Smith et al. Situated Analytics. Immersive Analytics, 2018.

[7] Doug A. Bowman et al. Walking with adaptive augmented reality workspaces: design and usage patterns. IUI, 2019.

[8] Won-Ki Jeong et al. DXR: A Toolkit for Building Immersive Data Visualizations. IEEE Transactions on Visualization and Computer Graphics, 2019.

[9] Ronald Azuma et al. A Survey of Augmented Reality. Presence: Teleoperators & Virtual Environments, 1997.

[10] Pierre Dragicevic et al. Embedded Data Representations. IEEE Transactions on Visualization and Computer Graphics, 2017.

[11] Sean White et al. Virtual Vouchers: Prototyping a Mobile Augmented Reality User Interface for Botanical Species Identification. 3D User Interfaces (3DUI '06), 2006.

[12] Jean-Pierre Jessel et al. Adaptive augmented reality: plasticity of augmentations. VRIC, 2014.

[13] Steven K. Feiner et al. Perceptual issues in augmented reality revisited. IEEE International Symposium on Mixed and Augmented Reality, 2010.

[14] Luigi Gallo et al. User-Driven View Management for Wearable Augmented Reality Systems in the Cultural Heritage Domain. 10th International Conference on P2P, Parallel, Grid, Cloud and Internet Computing (3PGCIC), 2015.

[15] Ben Shneiderman et al. The eyes have it: a task by data type taxonomy for information visualizations. Proceedings of the 1996 IEEE Symposium on Visual Languages, 1996.

[16] Dieter Schmalstieg et al. Hedgehog labeling: View management techniques for external labels in 3D space. IEEE Virtual Reality (VR), 2014.

[17] Sean White et al. SiteLens: situated visualization techniques for urban site visits. CHI, 2009.

[18] Maria Frucci et al. Touchless Target Selection Techniques for Wearable Augmented Reality Systems. 2015.

[19] Sean White et al. Interaction and presentation techniques for situated visualization. 2009.

[20] J. Bailenson et al. Virtual Superheroes: Using Superpowers in Virtual Reality to Encourage Prosocial Behavior. PLoS ONE, 2013.