Revisiting peephole pointing: a study of target acquisition with a handheld projector

Peephole pointing is a promising interaction technique for large workspaces that contain more information than can be appropriately displayed on a single screen. In peephole pointing, a window onto the virtual workspace is moved in space to reveal additional content. In 2008, two different models of peephole pointing were discussed: Cao, Li, and Balakrishnan proposed a two-component model, whereas Rohs and Oulasvirta investigated a similar model but concluded that Fitts' law is sufficient for predicting peephole pointing performance. We present a user study, performed with a handheld projector, showing that Cao et al.'s model outperforms Fitts' law in prediction accuracy only when different peephole sizes are used and users have no prior knowledge of target location. Under the conditions most likely to occur in practice, Fitts' law suffices. Additionally, we show that target overshooting is a key characteristic of peephole pointing, and we present the implementation of an orientation-aware handheld projector that enables peephole interaction without instrumenting the environment.
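The contrast between the two predictors can be sketched concretely. Below, `fitts_mt` is the standard Shannon formulation of Fitts' law; `two_component_mt` is an illustrative two-component form in the spirit of Cao et al., combining a peephole-size term and a target-width term with a weight `lam`. The exact formulation and all constants here (`a`, `b`, `lam`, and the distances in pixels) are illustrative assumptions, not values from either paper.

```python
import math

def fitts_mt(a, b, D, W):
    """Fitts' law, Shannon formulation: MT = a + b * log2(D/W + 1).
    a, b are empirically fitted constants; D is target distance, W is target width."""
    return a + b * math.log2(D / W + 1)

def two_component_mt(a, b, lam, D, W, S):
    """Illustrative two-component predictor in the spirit of Cao et al.:
    a weighted sum of a peephole-size term (S) and a target-width term (W).
    lam and this particular combination are assumptions for illustration;
    see the original paper for the fitted model."""
    return a + b * ((1 - lam) * math.log2(D / S + 1)
                    + lam * math.log2(D / W + 1))

# Hypothetical task: 800 px distance, 40 px target, 200 px peephole.
print(round(fitts_mt(0.1, 0.15, 800, 40), 3))
print(round(two_component_mt(0.1, 0.15, 0.5, 800, 40, 200), 3))
```

With a peephole larger than the target (S > W), the peephole term contributes a smaller index of difficulty, so the two models diverge; when the peephole size is fixed across conditions, the extra term is absorbed into the fitted constants, which is consistent with Fitts' law sufficing in the common case.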

[1] Xiang Cao, et al. Flashlight jigsaw: an exploratory study of an ad-hoc multi-player game on public displays, 2008, CSCW.

[2] Johannes Schöning, et al. Map navigation with mobile devices: virtual versus physical movement with and without visual context, 2007, ICMI '07.

[3] I. Scott MacKenzie, et al. Fitts' Law as a Research and Design Tool in Human-Computer Interaction, 1992, Hum. Comput. Interact.

[4] Wolfgang Stuerzlinger, et al. Laser Pointers as Collaborative Pointing Devices, 2002, Graphics Interface.

[5] Paul A. Beardsley, et al. Interaction using a handheld projector, 2005, IEEE Computer Graphics and Applications.

[6] Xing Chen, et al. Lumipoint: multi-user laser-based interaction on large tiled displays, 2002.

[7] Paul A. Beardsley, et al. Zoom-and-pick: facilitating visual zooming and precision pointing with interactive handheld projectors, 2005, UIST.

[8] William Buxton, et al. Boom chameleon: simultaneous capture of 3D viewpoint, voice and gesture annotations on a spatially-aware display, 2002, UIST '02.

[9] Masanori Sugimoto, et al. A semi-automatic realtime calibration technique for a handheld projector, 2007, VRST '07.

[10] Michael Rohs, et al. Target acquisition with camera phones when used as magic lenses, 2008, CHI.

[11] Paul A. Beardsley, et al. RFIG lamps: interacting with a self-describing world via photosensing wireless tags and projectors, 2004, ACM Trans. Graph.

[12] Steven K. Feiner, et al. Exploring interaction with a simulated wrist-worn projection display, 2005, Ninth IEEE International Symposium on Wearable Computers (ISWC'05).

[13] Ravin Balakrishnan, et al. Acquisition of expanding targets, 2002, CHI.

[14] Martin C. Emele, et al. Spotlight Navigation: Interaction with a Handheld Projection Device, 2004.

[15] Xiang Cao, et al. Interacting with dynamically defined information spaces using a handheld projector and a pen, 2006, UIST.

[16] Ricardo Tesoriero, et al. Distributed User Interfaces: Designing Interfaces for the Distributed Ecosystem, 2011.

[17] Stefan Rapp. Spotlight Navigation: a pioneering user interface for mobile projection, 2010.

[18] Ka-Ping Yee, et al. Peephole displays: pen interaction on spatially aware handheld computers, 2003, CHI '03.

[19] Hans-Werner Gellersen, et al. Personal Projectors for Pervasive Computing, 2012, IEEE Pervasive Computing.

[20] Carl Gutwin, et al. Wedge: clutter-free visualization of off-screen locations, 2008, CHI.

[21] Xiang Cao, et al. Peephole pointing: modeling acquisition of dynamically revealed targets, 2008, CHI.

[22] Patrick Baudisch, et al. Halo: a technique for visualizing off-screen objects, 2003, CHI '03.

[23] Andriy Pavlovych, et al. The tradeoff between spatial jitter and latency in pointing tasks, 2009, EICS '09.

[24] Michael Rohs, et al. Interaction with magic lenses: real-world validation of a Fitts' Law model, 2011, CHI.

[25] P. Fitts. The information capacity of the human motor system in controlling the amplitude of movement, 1954, Journal of Experimental Psychology.

[26] Benjamin B. Bederson, et al. A review of overview+detail, zooming, and focus+context interfaces, 2009, CSUR.

[27] Niels Henze, et al. Evaluation of an off-screen visualization for magic lens and dynamic peephole interfaces, 2010, Mobile HCI.

[28] Antonio Krüger, et al. Distributed User Interfaces for Projector Phones, 2011, Distributed User Interfaces.

[29] George W. Fitzmaurice, et al. Situated information spaces and spatially aware palmtop computers, 1993, CACM.

[30] Lars Erik Holmquist, et al. Ubiquitous graphics: combining hand-held and wall-size displays to interact with large images, 2006, AVI '06.

[31] David Ahlström, et al. Modeling and improving selection in cascading pull-down menus using Fitts' law, the steering law and force fields, 2005, CHI.

[32] Xiang Cao, et al. Multi-user interaction using handheld projectors, 2007, UIST.