Enabling mobile microinteractions

While much attention has been paid to the usability of desktop computers, mobile computers are quickly becoming the dominant platform. Because mobile computers may be used in nearly any situation, including while the user is in motion or performing other tasks, interfaces designed for stationary use may be inappropriate, and alternative interfaces should be considered. In this dissertation I consider the idea of microinteractions: interactions with a device that take less than four seconds to initiate and complete. Microinteractions are desirable because they may minimize interruption; that is, they allow a tiny burst of interaction with a device so that the user can quickly return to the task at hand. My research concentrates on methods for enabling microinteractions through wrist-based interaction, and I consider two modalities for this interaction: touchscreens and motion-based gestures. In the case of touchscreens, I consider the interface implications of making touchscreen watches usable with the finger, instead of the usual stylus, and investigate users' performance with a round touchscreen. For gesture-based interaction, I present MAGIC, a tool for designing gesture-based interactive systems, and detail the evaluation of the tool.
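The motion-gesture work summarized above rests on matching an incoming wrist-accelerometer trace against previously recorded example gestures. As an illustrative sketch only (the abstract does not specify MAGIC's recognizer), the following compares traces with dynamic time warping, a common technique for this kind of template matching; the function names and the toy one-axis traces are hypothetical, and a real system would record three-axis samples at a fixed rate.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two accelerometer traces.

    Each trace is a NumPy array of samples; samples may be scalars
    (one axis) or vectors (e.g., three-axis acceleration).
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # per-sample distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(trace, templates):
    """Return the label of the recorded template closest to the trace."""
    return min(templates, key=lambda label: dtw_distance(trace, templates[label]))

# Toy usage with hypothetical one-axis templates:
templates = {"flick": np.array([0.0, 1.0, 2.0, 1.0, 0.0]),
             "twist": np.array([0.0, -1.0, -2.0, -1.0, 0.0])}
print(classify(np.array([0.0, 0.9, 2.1, 1.2, 0.0]), templates))  # -> "flick"
```

Because DTW tolerates differences in gesture speed, a designer can record only a handful of examples per gesture; in practice a rejection threshold on the best distance is also needed so that everyday wrist motion is not misread as a command.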
