Two-handed virtual manipulation

We discuss a two-handed user interface designed to support three-dimensional neurosurgical visualization. By itself, this system is a “point design,” an example of an advanced user interface technique. In this work, we argue that to understand why interaction techniques do or do not work, and to suggest possibilities for new techniques, it is important to move beyond point design and introduce careful scientific measurement of human behavioral principles. In particular, we argue that the common-sense viewpoint that “two hands save time by working in parallel” may not always be an effective way to think about two-handed interface design: the hands do not necessarily work in parallel (there is a structure to two-handed manipulation), and two hands do more than just save time over one hand (they provide the user with more information and can structure how the user thinks about a task). To support these claims, we present an interface design developed in collaboration with neurosurgeons, which has undergone extensive informal usability testing, as well as a pair of formal experimental studies that investigate behavioral aspects of two-handed virtual object manipulation. Our hope is that this discussion will help others apply the lessons of our neurosurgery application to future two-handed user interface designs.
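One concrete reading of the claim that two-handed manipulation “has a structure” is the asymmetric division of labor in which the non-dominant hand holds an object that establishes a frame of reference while the dominant hand operates a tool relative to that frame. The following is only a minimal sketch of that idea, not the implementation described in the paper: it assumes 6-DOF tracker poses delivered as 4x4 homogeneous matrices, and the names (relative_transform, head_prop_pose, tool_prop_pose) are hypothetical.

import numpy as np

def relative_transform(dominant_pose: np.ndarray,
                       nondominant_pose: np.ndarray) -> np.ndarray:
    """Express the dominant hand's pose in the reference frame
    defined by the non-dominant hand.

    Both poses are 4x4 homogeneous matrices in world (tracker)
    coordinates. Working in the non-dominant hand's frame means
    the reference object can be moved or turned freely without
    disturbing the tool's relationship to it.
    """
    return np.linalg.inv(nondominant_pose) @ dominant_pose


# Hypothetical example: the non-dominant hand holds a reference
# object (e.g., a head prop); the dominant hand holds a tool
# (e.g., a cutting-plane prop). Identity poses are used for brevity.
head_prop_pose = np.eye(4)                   # reference prop pose
tool_prop_pose = np.eye(4)
tool_prop_pose[:3, 3] = [0.10, 0.0, 0.05]    # tool offset in meters

tool_in_head_frame = relative_transform(tool_prop_pose, head_prop_pose)
print(tool_in_head_frame[:3, 3])             # tool position relative to the reference object

Because the tool is expressed relative to the reference frame, moving the non-dominant hand carries the whole working context with it rather than acting as a second, independent, parallel input; this is one sense in which the two hands cooperate rather than merely divide the work.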
