Multimodal Gaze Interaction for Creative Design

We present a new application ("Sakura") that enables people with physical impairments to produce creative visual design work using a multimodal gaze approach. The system integrates multiple features tailored for gaze interaction, including selection of design artefacts via a novel grid approach, control methods for manipulating canvas objects, creative typography, a new color selection approach, and a customizable guide technique that facilitates the alignment of design elements. A user evaluation (N=24) found that non-disabled users were able to use the application to complete common design activities and rated the system positively in terms of usability. A follow-up study with physically impaired participants (N=6) demonstrated that they were able to control the system while working toward a website design, rating the application as having a good level of usability. Our research highlights new directions for making creative activities more accessible to people with physical impairments.
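The abstract does not detail how Sakura's grid selection works, so the following is only a minimal illustrative sketch of one common pattern it could plausibly build on: dwell-based selection combined with recursive grid refinement, where gazing at a cell long enough zooms the grid into that cell until the region is small enough to disambiguate a single artefact. The dwell threshold, grid size, and all names below are hypothetical, not taken from the paper.

```python
import time

# Hypothetical sketch of dwell-based recursive grid selection for gaze
# input. Assumes an eye tracker exposing a stream of (x, y) fixation
# points; none of these names come from the Sakura paper itself.

DWELL_SECONDS = 0.8        # assumed dwell threshold before a cell "clicks"
GRID_ROWS, GRID_COLS = 3, 3

def cell_at(x, y, bounds):
    """Map a gaze point to the (row, col) of the grid cell it falls in."""
    left, top, width, height = bounds
    col = min(int((x - left) / (width / GRID_COLS)), GRID_COLS - 1)
    row = min(int((y - top) / (height / GRID_ROWS)), GRID_ROWS - 1)
    return row, col

def select_by_grid(gaze_stream, bounds, min_cell_px=40):
    """Shrink the active region around dwelled-on cells until one cell
    is small enough to identify a single on-canvas target.

    gaze_stream is assumed to be an endless iterator of fixation points.
    """
    left, top, width, height = bounds
    while width / GRID_COLS > min_cell_px:
        current, dwell_start = None, None
        for x, y in gaze_stream:
            cell = cell_at(x, y, (left, top, width, height))
            if cell != current:                         # gaze moved cells
                current, dwell_start = cell, time.monotonic()
            elif time.monotonic() - dwell_start >= DWELL_SECONDS:
                break                                   # dwell completed
        row, col = current
        width, height = width / GRID_COLS, height / GRID_ROWS
        left, top = left + col * width, top + row * height
    return left, top, width, height   # region containing the chosen target
```

Recursive refinement of this kind is a standard way to trade a few extra dwell steps for robustness to gaze jitter on small targets; whether Sakura refines recursively or selects in a single pass is not stated in the abstract.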
