Size Perception of Augmented Objects by Different AR Displays

Augmented reality (AR) has established itself as one of the main media technologies, and its further proliferation depends on its usability, ergonomics, and perceptual qualities, i.e., how effectively and correctly it conveys information. However, there is no established guideline for how to visualize augmented information properly across the various types of AR displays. We investigate one important perceptual quality in AR, the correct size perception of the augmented object, for three representative AR displays: (1) a small hand-held mobile device, (2) a closed video see-through HMD, and (3) an optical see-through HMD. The augmented object, a nominally sized box, is visualized in three different styles: (1) as a texture-mapped simple polygonal model, (2) as a bump-mapped polygonal model with shadow, and (3) as a detailed scanned mesh model. Size perception was assessed at two viewing angles: (1) sitting down and looking straight ahead, and (2) standing and looking down at 45 degrees (1.5 m distance). The experimental results revealed significant effects of display type, rendering style, and viewing angle. For example, users tended to overestimate the object size and took longer to complete the task when the small hand-held display was used. We believe these findings can serve as a guideline for developing effective AR applications.
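
To make the experimental design concrete, the sketch below enumerates the 3 × 3 × 2 condition set implied by the abstract (display type × rendering style × viewing angle) and produces a randomized presentation order. This is a hypothetical illustration, not code from the study: the condition labels, the `make_trial_order` helper, and the use of simple random shuffling (rather than whatever counterbalancing scheme the authors actually used) are all assumptions.

```python
# Hypothetical sketch: enumerate the full-factorial condition set implied by
# the abstract -- 3 displays x 3 rendering styles x 2 viewing angles = 18
# conditions -- and shuffle them into a per-participant presentation order.
from itertools import product
import random

DISPLAYS = ["handheld_mobile", "video_see_through_hmd", "optical_see_through_hmd"]
RENDER_STYLES = ["textured_polygon", "bump_mapped_with_shadow", "scanned_mesh"]
VIEW_ANGLES = ["seated_straight", "standing_45deg_1_5m"]


def make_trial_order(seed: int) -> list[tuple[str, str, str]]:
    """Return a randomized order of the 18 (display, style, angle) conditions."""
    conditions = list(product(DISPLAYS, RENDER_STYLES, VIEW_ANGLES))
    rng = random.Random(seed)  # fixed seed so an order can be reproduced
    rng.shuffle(conditions)
    return conditions


if __name__ == "__main__":
    for display, style, angle in make_trial_order(seed=42):
        print(f"{display:25s} {style:25s} {angle}")
```

In practice a within-subjects study like this would typically balance condition order across participants (e.g., with a Latin square) rather than shuffle independently per participant; the abstract does not specify which approach was taken.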
