Separability of spatial manipulations in multi-touch interfaces

Multi-touch interfaces allow users to translate, rotate, and scale digital objects in a single interaction. However, this freedom becomes a problem when users intend to perform only a subset of these manipulations. For example, a user trying to scale an object in a print-layout program might find that the object was also slightly translated and rotated, disturbing a layout that was carefully arranged earlier. We implemented and tested interaction techniques that allow users to restrict interaction to a subset of manipulations. Magnitude Filtering suppresses transformations (e.g., rotation) whose magnitude is small. Gesture Matching classifies the user's input as one of a set of manipulation gestures. Handles adapts the conventional single-touch handles approach to touch input. Our empirical study showed that these techniques significantly reduce layout errors, although the Handles technique was slowest. A variation of Gesture Matching offered the best combination of speed and control and was favored by participants.
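The core idea behind Magnitude Filtering can be illustrated with a short sketch: transformation components whose magnitude falls below a threshold are zeroed out, so an interaction intended only as a scale does not also translate or rotate the object. The thresholds and the `Transform` representation below are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch of Magnitude Filtering: suppress transformation
# components that are small in magnitude. Threshold values are
# illustrative assumptions, not taken from the paper.

from dataclasses import dataclass


@dataclass
class Transform:
    dx: float = 0.0        # translation in x (pixels)
    dy: float = 0.0        # translation in y (pixels)
    rotation: float = 0.0  # rotation (degrees)
    scale: float = 1.0     # scale factor (1.0 = unchanged)


def magnitude_filter(t: Transform,
                     translate_thresh: float = 4.0,
                     rotate_thresh: float = 3.0,
                     scale_thresh: float = 0.05) -> Transform:
    """Zero out transformation components below their magnitude threshold."""
    out = Transform(t.dx, t.dy, t.rotation, t.scale)
    # Suppress translation if the touch drifted less than the threshold.
    if (out.dx ** 2 + out.dy ** 2) ** 0.5 < translate_thresh:
        out.dx = out.dy = 0.0
    # Suppress small, likely unintended rotations.
    if abs(out.rotation) < rotate_thresh:
        out.rotation = 0.0
    # Suppress scale changes close to 1.0.
    if abs(out.scale - 1.0) < scale_thresh:
        out.scale = 1.0
    return out
```

A gesture intended purely as a scale (large scale change, small incidental drift and rotation) passes through with only its scale component intact, which is the separability behavior the technique targets.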
