Users' quest for an optimized representation of a multi-device space

A plethora of reaching techniques, intended for moving objects between locations distant from the user, have recently been proposed and tested. One of the most promising is the Radar View. Until now, the focus has been mostly on how users can interact efficiently with a given radar map, not on how these maps are created and maintained. It is, for instance, unclear whether users would appreciate the ability to adapt such radar maps to particular tasks and personal preferences. In this paper we address this question by means of a prolonged user study with the Sketch Radar prototype. The study demonstrates that users do indeed modify the default maps to improve interaction for particular tasks. It also provides insights into how and why the default physical map is modified.
