Speech-filtered bubble ray: improving target acquisition on display walls

The rapid development of large interactive wall displays has been accompanied by research on methods that allow people to interact with the display at a distance. The basic method for target acquisition is ray casting, which projects a cursor from one's pointing finger or hand position; the problem is that selection is slow and error-prone with small targets. A better method is the bubble cursor, which resizes the cursor's activation area to effectively enlarge the target size. The catch is that this technique's effectiveness depends on the proximity of surrounding targets: while beneficial in sparse spaces, it is less so when targets are densely packed together. Our method is the speech-filtered bubble ray, which uses speech to transform a dense target space into a sparse one. Our strategy builds on what people already do: people pointing to distant objects in a physical workspace typically disambiguate their choice through speech. For example, a person could point to a stack of books and say "the green one". Gesture indicates the approximate location for the search, and speech 'filters' unrelated books from the search. Our technique works the same way; a person specifies a property of the desired object, and only objects matching that property influence the bubble's size. In a controlled evaluation, people were faster with the speech-filtered bubble ray than with the standard bubble ray and ray casting, and they preferred it over both.
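The selection rule described above can be sketched in a few lines: a bubble cursor picks the target whose *edge* is closest to the cursor, and the speech filter simply restricts the candidate set before that distance test runs, so a dense space behaves like a sparse one. This is a minimal illustration, not the paper's implementation; `Target`, its `color` field, and `bubble_select` are hypothetical names chosen for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Target:
    x: float
    y: float
    radius: float
    color: str  # the spoken property used for filtering, e.g. "green" (hypothetical attribute)

def bubble_select(cursor_x, cursor_y, targets, spoken_filter=None):
    """Return the target whose edge is closest to the cursor.

    If a spoken filter (e.g. a color word) is given, only matching
    targets compete for the bubble, effectively sparsifying the space.
    """
    candidates = [t for t in targets
                  if spoken_filter is None or t.color == spoken_filter]
    if not candidates:
        return None
    # Bubble-cursor rule: measure distance to the target's edge, not its center.
    return min(candidates,
               key=lambda t: math.hypot(t.x - cursor_x, t.y - cursor_y) - t.radius)
```

Without a filter, a nearby distractor of the wrong kind can capture the bubble; with the filter, the bubble snaps to the nearest target of the spoken kind, however far away the distractors are.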
