Layerable Apps: Comparing Concurrent and Exclusive Display of Augmented Reality Applications

Current augmented reality (AR) interfaces are often designed for interacting with one application at a time, significantly limiting a user’s ability to interact with and switch between multiple applications or modalities that could run in parallel. In this work, we introduce an application model called Layerable Apps, which supports a variety of AR application types while enabling multitasking through concurrent execution, fast application switching, and the ability to layer application views so that users can adjust the degree of augmentation to their preference. We evaluated Layerable Apps in a within-subjects user study (n=44), comparing it against a traditional single-focus application model on a split-information task that required the simultaneous use of multiple applications. We found differences in quantitative task performance favoring the Layerable mode. We also analyzed app usage patterns, spatial awareness, and overall preferences between the two modes, as well as between experienced and novice AR users.
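
To make the layering idea concrete, the sketch below models each concurrently running app as a view layer whose visibility and opacity can be adjusted independently, alongside an exclusive single-focus mode for comparison. This is a minimal illustration of the concept only, not the paper's implementation; all class and method names (AppLayer, LayerStack, focus, show_all, etc.) are hypothetical.

```python
# Hypothetical sketch of a layerable app model: each running AR app
# contributes a view layer that can be shown, hidden, or faded, so
# several apps can be composited at once (layerable mode) or one app
# can take exclusive focus (traditional single-focus mode).

from dataclasses import dataclass, field


@dataclass
class AppLayer:
    """One concurrently running AR application and its view layer."""
    name: str
    visible: bool = True
    opacity: float = 1.0  # 0.0 = fully faded out, 1.0 = fully shown


@dataclass
class LayerStack:
    """Manages concurrent app layers and fast switching between them."""
    layers: list[AppLayer] = field(default_factory=list)

    def launch(self, name: str) -> AppLayer:
        layer = AppLayer(name)
        self.layers.append(layer)  # new apps stack on top
        return layer

    def set_opacity(self, name: str, opacity: float) -> None:
        """Adjust the degree of augmentation contributed by one app."""
        for layer in self.layers:
            if layer.name == name:
                layer.opacity = max(0.0, min(1.0, opacity))

    def focus(self, name: str) -> None:
        """Exclusive (single-focus) mode: hide every other layer."""
        for layer in self.layers:
            layer.visible = (layer.name == name)

    def show_all(self) -> None:
        """Layerable mode: all running apps are composited together."""
        for layer in self.layers:
            layer.visible = True

    def composite_order(self) -> list[AppLayer]:
        """Visible layers, bottom to top, as they would be rendered."""
        return [l for l in self.layers if l.visible and l.opacity > 0]


if __name__ == "__main__":
    stack = LayerStack()
    stack.launch("navigation")
    stack.launch("messaging")
    stack.set_opacity("messaging", 0.4)  # de-emphasize without closing
    stack.focus("navigation")            # traditional exclusive mode
    stack.show_all()                     # back to concurrent layering
    print([layer.name for layer in stack.composite_order()])
```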
