Dynamic text management for see-through wearable and heads-up display systems

Reading text safely and easily while mobile has long been a challenge for see-through displays. To use optical see-through head-mounted displays (HMDs) or heads-up display (HUD) systems effectively in constantly changing environments, variables such as lighting conditions, human or vehicular obstructions in the user's path, and scene variation must be handled robustly. This paper introduces a new intelligent text management system that actively manages the movement of text in a user's field of view. Research to date lacks a method for migrating user-centric content, such as e-mail or text messages, through a user's environment while mobile. Unlike most current annotation and view management systems, ours uses camera tracking to find, in real time, dark, uniform regions along the route a user is travelling. We then move text from one viable location to the next to maximize readability. A pilot experiment with 19 participants shows that our system's text placement is preferred over fixed-location text configurations.
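The paper does not reproduce its region-finding algorithm here, but the idea of scoring camera frames for dark, uniform regions can be sketched as a per-block pass over a grayscale image: a block is a viable text anchor if its mean luminance is low (dark) and its luminance standard deviation is low (uniform). The block size and thresholds below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def find_dark_uniform_regions(gray, block=32, max_mean=80, max_std=15):
    """Return (row, col) offsets of fixed-size blocks in a grayscale
    frame that are both dark (low mean luminance) and uniform (low
    luminance standard deviation), i.e. candidate text locations."""
    h, w = gray.shape
    regions = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            patch = gray[r:r + block, c:c + block]
            if patch.mean() <= max_mean and patch.std() <= max_std:
                regions.append((r, c))
    return regions

# Synthetic 64x64 frame: one bright quadrant, three dark quadrants.
frame = np.zeros((64, 64))
frame[:32, :32] = 200
print(find_dark_uniform_regions(frame, block=32))
```

In a live system such a pass would run on every tracked frame, with text then migrated between the surviving candidate blocks over time; a real implementation would also weight candidates by distance from the text's current position to avoid distracting jumps.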
