There are currently two primary ways of viewing location-specific information in situ on handheld mobile device screens: a see-through augmented reality interface and a touch-based interface with panoramas. The two approaches rest on fundamentally different interaction metaphors: an AR style of interaction, in which the user holds up the device and physically moves it to change views of the world, and a touch-based technique, in which panorama navigation is independent of the physical world. We investigated how this difference in interaction technique affects a user's spatial understanding of the mixed reality world. Our study found that AR-style interaction provided better spatial understanding overall, while touch-based interaction made the experience more closely resemble interaction in a separate virtual environment.