A Classification Space for Interaction on Mobile Devices

The proliferation and diversity of mobile devices in everyday life pose new challenges for designers: screen size, text input, pointing, and the management of services while on the move all constitute new constraints. Many interaction techniques have been proposed in recent years to address these problems. In this article, we describe a space for classifying the existing interaction techniques proposed for mobile devices.
