User Expectations of Everyday Gaze Interaction on Smartglasses
Andrés Lucero | Roope Raisamo | Tero Jokela | Deepak Akkil | Jari Kangas | Marja Salmimaa