Exploration of Techniques for Rapid Activation of Glanceable Information in Head-Worn Augmented Reality
[1] Stephen DiVerdi, et al. Level of detail interfaces, 2004, Third IEEE and ACM International Symposium on Mixed and Augmented Reality.
[2] Chris Harrison, et al. Lean and zoom: proximity-aware user interface and content magnification, 2008, CHI.
[3] Andreas Bulling, et al. A Design Space for Gaze Interaction on Head-mounted Displays, 2019, CHI.
[4] Paul Milgram, et al. Perceptual issues in augmented reality, 1996, Electronic Imaging.
[5] Kin K. Leung, et al. Context-Awareness for Mobile Sensing: A Survey and Future Directions, 2016, IEEE Communications Surveys & Tutorials.
[6] Mark Billinghurst, et al. Gaze window: A new gaze interface showing relevant content close to the gaze point, 2020, Journal of the Society for Information Display.
[7] S. Tipper, et al. Gaze cueing of attention: visual attention, social cognition, and individual differences, 2007, Psychological Bulletin.
[8] Sandra G. Hart, et al. NASA-Task Load Index (NASA-TLX); 20 Years Later, 2006.
[9] Ying-Chao Tung, et al. User-Defined Game Input for Smart Glasses in Public Space, 2015, CHI.
[10] Holger Regenbrecht, et al. Towards Pervasive Augmented Reality: Context-Awareness in Augmented Reality, 2017, IEEE Transactions on Visualization and Computer Graphics.
[11] Roel Vertegaal, et al. Attentive User Interfaces, 2003.
[12] Margrit Betke, et al. Communication via eye blinks and eyebrow raises: video-based human-computer interfaces, 2003, Universal Access in the Information Society.
[13] Arindam Dey, et al. Estimating Gaze Depth Using Multi-Layer Perceptron, 2017, 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR).
[14] Steven K. Feiner, et al. Augmented reality: a new way of seeing, 2002, Scientific American.
[15] Weiyuan Liu, et al. Natural user interface - next mainstream product user interface, 2010, 2010 IEEE 11th International Conference on Computer-Aided Industrial Design & Conceptual Design 1.
[16] Mark Billinghurst, et al. Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality, 2018, CHI.
[17] Doug A. Bowman, et al. Evaluating the Potential of Glanceable AR Interfaces for Authentic Everyday Uses, 2021, 2021 IEEE Virtual Reality and 3D User Interfaces (VR).
[18] Yomna Abdelrahman, et al. Exploring the Potential of Augmented Reality in Domestic Environments, 2019, MobileHCI.
[19] Doug A. Bowman, et al. Occlusion Management Techniques for Everyday Glanceable AR Interfaces, 2020, 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW).
[20] Max Mühlhäuser, et al. Introduction to Ubiquitous Computing, 2008, Handbook of Research on Ubiquitous Computing Technology for Real Time Enterprises.
[21] Doug A. Bowman, et al. Glanceable AR: Evaluating Information Access Methods for Head-Worn Augmented Reality, 2020, 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR).
[22] Enes Yigitbas, et al. Development framework for context-aware augmented reality applications, 2020, EICS.
[23] Tianyu Zhang, et al. DepthMove: Leveraging Head Motions in the Depth Dimension to Interact with Virtual Reality Head-Worn Displays, 2019, 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).
[24] Pourang Irani, et al. Are you comfortable doing that?: acceptance studies of around-device gestures in and for public settings, 2014, MobileHCI '14.
[25] R. Johansson, et al. Eye–Hand Coordination in Object Manipulation, 2001, The Journal of Neuroscience.
[26] Florian Alt, et al. Looking for Info: Evaluation of Gaze Based Information Retrieval in Augmented Reality, 2021, INTERACT.
[27] Klaus-Peter Hoffmann, et al. Target selection in eye–hand coordination: Do we reach to where we look or do we look to where we reach?, 2005, Experimental Brain Research.
[28] Roel Vertegaal. Introduction, 2003, CACM.
[29] Philipp Slusallek, et al. Predicting the gaze depth in head-mounted displays using multiple feature regression, 2018, ETRA.
[30] Florian Alt, et al. ARtention: A design space for gaze-adaptive user interfaces in augmented reality, 2021, Comput. Graph.
[31] Hai-Ning Liang, et al. Pointing and Selection Methods for Text Entry in Augmented Reality Head Mounted Displays, 2019, 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).
[32] Xiang Li, et al. Exploration of Hands-free Text Entry Techniques for Virtual Reality, 2020, 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).
[33] J. B. Brooke. SUS: A 'Quick and Dirty' Usability Scale, 1996.
[34] Jong-Soo Choi, et al. Wearable augmented reality system using gaze interaction, 2008, 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality.
[35] Stephen DiVerdi, et al. ARWin - a desktop augmented reality window manager, 2003, The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.
[36] Doug A. Bowman, et al. Walking with adaptive augmented reality workspaces: design and usage patterns, 2019, IUI.
[37] Robert Xiao, et al. Gaze+Gesture: Expressive, Precise and Targeted Free-Space Interactions, 2015, ICMI.
[38] Marc Erich Latoschik, et al. Evaluation of Binocular Eye Trackers and Algorithms for 3D Gaze Interaction in Virtual Reality Environments, 2008, J. Virtual Real. Broadcast.
[39] Aunnoy K. Mutasim, et al. Pinch, Click, or Dwell: Comparing Different Selection Techniques for Eye-Gaze-Based Pointing in Virtual Reality, 2021, ETRA Short Papers.
[40] Robert J. K. Jacob, et al. What you look at is what you get: eye movement-based interaction techniques, 1990, CHI '90.
[41] Ann McNamara, et al. Information Placement in Virtual Reality, 2019, 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR).
[42] David Lindlbauer, et al. Context-Aware Online Adaptation of Mixed Reality Interfaces, 2019, UIST.
[43] Kaisa Väänänen, et al. Exploring the augmented home window: user perceptions of the concept, 2014, MUM.
[44] Florian Alt, et al. StARe: Gaze-Assisted Face-to-Face Communication in Augmented Reality, 2020, ETRA Adjunct.