Around device interaction for multiscale navigation

In this paper we study the design space of free-space interactions for multiscale navigation afforded by mobile depth sensors. Such interactions promise a larger working volume and more fluid control than touch input, and they avoid the screen-occlusion effects intrinsic to touchscreens. This work contributes the first study to show that mobile free-space interactions can perform as well as touch. We describe a user study evaluating mobile free-space navigation techniques and the impact of sensor orientation on user experience, and we analyze sensor orientation and interaction-volume usage, with strong implications for how sensors should be placed on mobile devices. Finally, we discuss guidelines for future mobile free-space interaction techniques and sensor design.
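To make the interaction class concrete, the following is a minimal sketch of one plausible free-space pan-zoom mapping, assuming a depth sensor that reports a 3D hand centroid each frame. Everything here is illustrative: the names hand_x, hand_y, hand_z, the gains, and the neutral plane rest_z are hypothetical choices, not the techniques evaluated in the paper.

    import math
    from dataclasses import dataclass

    @dataclass
    class Viewport:
        """A multiscale viewport: a pan offset in world units plus a zoom factor."""
        cx: float = 0.0
        cy: float = 0.0
        zoom: float = 1.0

    def update_viewport(vp: Viewport, hand_x: float, hand_y: float, hand_z: float,
                        dt: float, pan_gain: float = 2.0, zoom_gain: float = 1.5,
                        rest_z: float = 0.25) -> Viewport:
        """Rate-control mapping: lateral hand offset drives pan velocity and
        distance from a neutral plane drives zoom rate. hand_x/hand_y are the
        hand's offset from the sensor axis in meters; hand_z is its distance
        from the sensor; rest_z is a neutral plane where nothing changes."""
        # Divide pan speed by zoom so hand motion feels constant on screen
        # regardless of the current scale.
        vp.cx += pan_gain * hand_x * dt / vp.zoom
        vp.cy += pan_gain * hand_y * dt / vp.zoom
        # Exponential zoom: holding the hand closer than rest_z zooms in,
        # farther zooms out, at a rate proportional to the offset.
        vp.zoom *= math.exp(zoom_gain * (rest_z - hand_z) * dt)
        vp.zoom = min(max(vp.zoom, 1e-3), 1e3)  # clamp to a sane multiscale range
        return vp

Rate control (hand offset drives velocity rather than position) is one common choice for clutch-free free-space input; a position-control mapping would be an equally plausible sketch.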
