The Unadorned Desk: Exploiting the Physical Space around a Display as an Input Canvas

In everyday office work, people smoothly use the space on their physical desks to work with documents of interest and to keep tools and materials nearby for easy use. In contrast, the limited screen space of computer displays imposes interface constraints. Associated material is placed off-screen (i.e., temporarily hidden) and requires extra work to access (window switching, menu selection) or crowds and competes with the work area (e.g., palettes and icons). This problem is worsened by the increasing popularity of small displays such as tablets and laptops. To mitigate this problem, we investigate how we can exploit an unadorned physical desk space as an additional input canvas. With minimal augmentation, our Unadorned Desk detects coarse hovering over and touching of discrete areas ('items') within a given region on an otherwise regular desk, which is used as input to the desktop computer. We hypothesize that people's spatial memory will let them touch particular desk locations without looking. In contrast to other augmented desks, our system provides optional feedback of touches directly on the computer's screen. We conducted a user study to understand how people make use of this input space. Participants freely placed and retrieved items onto/from the desk. We found that participants organize items in a grid-like fashion for easier access later on. In a second experiment, participants had to retrieve items from a predefined grid. When only a few (large) items were located in the area, participants were faster without feedback and there was (surprisingly) no difference in error rates with or without feedback. As the number of items grew (i.e., items shrank to fit the area), participants increasingly relied on feedback to minimize errors, at the cost of speed.
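To make the interaction concrete, the following is a minimal sketch of the kind of mapping such a system performs: a tracked fingertip position above a rectangular desk region is assigned to a grid cell ("item") and classified as a touch or a hover. This is not the paper's implementation; the sensor setup, coordinate frame, grid dimensions, and height thresholds below are illustrative assumptions (e.g., a depth camera looking down at the desk).

```python
# Sketch (assumed setup): map a fingertip position above an unadorned desk
# region to a grid "item" plus a hover/touch state. All names, units, and
# thresholds are hypothetical, chosen only to illustrate the idea.

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DeskRegion:
    """Rectangular input region on the desk, in sensor coordinates (metres)."""
    x0: float      # left edge of the region
    y0: float      # near edge of the region
    width: float   # region width
    height: float  # region depth
    rows: int      # grid rows (items shrink as the grid grows)
    cols: int      # grid columns

    def cell_at(self, x: float, y: float) -> Optional[Tuple[int, int]]:
        """Return (row, col) of the grid cell under (x, y), or None if outside."""
        if not (self.x0 <= x < self.x0 + self.width and
                self.y0 <= y < self.y0 + self.height):
            return None
        col = int((x - self.x0) / self.width * self.cols)
        row = int((y - self.y0) / self.height * self.rows)
        return row, col

# Assumed height thresholds above the desk plane (metres):
TOUCH_MAX = 0.01   # within 1 cm of the desk counts as a touch
HOVER_MAX = 0.10   # up to 10 cm counts as a hover (drives optional feedback)

def classify(region: DeskRegion, x: float, y: float, z: float):
    """Map a fingertip position to ('touch' | 'hover' | None, cell)."""
    cell = region.cell_at(x, y)
    if cell is None:
        return None, None
    if z <= TOUCH_MAX:
        return "touch", cell   # select the item stored in this cell
    if z <= HOVER_MAX:
        return "hover", cell   # show on-screen feedback without selecting
    return None, None

# Example: a 60 cm x 40 cm desk region split into a 2 x 4 grid of items.
region = DeskRegion(x0=0.0, y0=0.0, width=0.6, height=0.4, rows=2, cols=4)
print(classify(region, 0.35, 0.15, 0.005))   # -> ('touch', (0, 2))
```

Note how the grid resolution is the key trade-off in this sketch: with few rows and columns each cell is large and can be hit reliably from spatial memory alone, whereas finer grids shrink the cells and make the optional on-screen feedback increasingly necessary, mirroring the speed/error pattern reported in the second experiment.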
