Hover Pad: interacting with autonomous and self-actuated displays in space

Handheld displays enable flexible spatial exploration of information spaces: users can physically navigate through three-dimensional space to access information at specific locations. Requiring users to constantly hold the display, however, has several limitations: (1) inaccuracies due to natural hand tremor; (2) fatigue over time; and (3) exploration limited to arm's reach. We investigate autonomous, self-actuated displays that can freely move and hold their position and orientation in space without users having to hold them at all times. We illustrate various stages of such a display's autonomy, ranging from manual to fully autonomous, which, depending on the task, facilitate interaction. Further, we discuss possible motion control mechanisms for these displays and present several interaction techniques they enable. Our Hover Pad toolkit supports exploring five degrees of freedom of self-actuated, autonomous displays, along with the control and interaction techniques we developed. We illustrate the utility of the toolkit with five prototype applications, such as a volumetric medical data explorer.
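To make the "hold position and orientation in space" idea concrete, the sketch below shows one minimal way a self-actuated display could converge on and hold a five-degree-of-freedom pose with a simple proportional controller. This is an illustrative assumption, not the Hover Pad implementation: the `Pose` fields (3D position plus yaw and pitch) are one plausible split of the five degrees of freedom, and `hold_step` is a hypothetical helper.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # One plausible 5-DoF split: 3D position plus two rotation axes.
    x: float
    y: float
    z: float
    yaw: float
    pitch: float

def hold_step(current: Pose, target: Pose, gain: float = 0.5) -> Pose:
    """One proportional-control step: move each axis a fraction of the
    remaining error toward the target pose."""
    return Pose(*(c + gain * (t - c)
                  for c, t in zip(vars(current).values(),
                                  vars(target).values())))

# Usage: repeatedly stepping shrinks the pose error geometrically,
# so the display settles at (and then holds) the commanded pose.
pose = Pose(0.0, 0.0, 1.0, 0.0, 0.0)
target = Pose(0.5, 0.0, 1.2, 0.0, 0.0)
for _ in range(20):
    pose = hold_step(pose, target)
```

A real controller would of course close the loop over tracked sensor data and actuator limits; the point here is only the control structure.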
