3D remote interface for smart displays

This paper presents a novel user interface for smart displays that combines bare-hand gestures and the line of sight (LoS), captured from a distance by a depth camera without any handheld devices, together with a 3D GUI that provides both stereoscopy and motion parallax. The proposed interface enables precise and convenient manipulation, making it applicable to browsing thousands of channels and/or media files. In particular, the combined interaction of the two modalities achieves a manipulation resolution of 120 (x) × 70 (y) × 5 (z). Various user tasks were then performed to assess the proposed interface.
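The paper does not include code; as an illustrative sketch only, the snippet below shows one way a gaze-plus-hand mapping of this kind could be expressed, assuming the LoS selects a coarse on-screen position and a relative hand offset from the depth camera refines the cursor within the stated 120 × 70 × 5 grid. All names, gains, and the mapping itself are assumptions for illustration, not the authors' method.

```python
import numpy as np

# Illustrative sketch only: the mapping below is an assumption, not the
# paper's actual algorithm. The LoS gives a coarse screen position; the
# bare-hand offset (from the depth camera) refines the cursor within a
# 120 (x) x 70 (y) x 5 (z) manipulation grid.

GRID = np.array([120, 70, 5])  # manipulation resolution reported in the paper


def combined_cursor(gaze_point, hand_offset, gain=(2.0, 2.0, 1.0)):
    """Map a normalized gaze point and a hand offset to a grid cell.

    gaze_point  : (x, y) in [0, 1]^2, where the LoS intersects the display.
    hand_offset : (dx, dy, dz) hand displacement in metres from a neutral
                  pose, as estimated from the depth camera (hypothetical).
    gain        : hypothetical scaling from hand motion to grid cells.
    """
    gx, gy = gaze_point
    dx, dy, dz = hand_offset

    # Coarse position from gaze, fine correction from the hand.
    x = gx * GRID[0] + gain[0] * dx * GRID[0]
    y = gy * GRID[1] + gain[1] * dy * GRID[1]
    # Pushing or pulling the hand along z selects the depth layer.
    z = (dz + 0.5) * GRID[2] * gain[2]

    cell = np.clip(np.round([x, y, z]), 0, GRID - 1).astype(int)
    return tuple(cell)


if __name__ == "__main__":
    # Example: user looks slightly left of centre and pushes the hand forward.
    print(combined_cursor(gaze_point=(0.4, 0.5), hand_offset=(0.02, 0.0, 0.1)))
```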
