Graspable User Interfaces

This dissertation defines and explores Graspable User Interfaces, an evolution of the input mechanisms used in graphical user interfaces (GUIs). A Graspable UI design gives users concurrent access to multiple, specialized input devices that serve as dedicated physical interface widgets, affording physical manipulation and spatial arrangement. As in conventional GUIs, the physical devices function as "handles" or manual controllers for logical functions of widgets in the interface. However, the notion of the Graspable UI builds on current practice in several ways. A conventional GUI typically has only one graphical input device, such as a mouse; the physical handle is therefore necessarily "time-multiplexed," being repeatedly attached to and detached from the various logical functions of the GUI. A significant aspect of the Graspable UI is that there can be more than one input device, so input control can instead be "space-multiplexed": different devices can be attached to different functions, each independently (and possibly simultaneously) accessible. This, in turn, makes it possible to exploit the shape, size, and position of each physical controller to increase functionality and decrease complexity, and it allows the attachment of a device to a function to persist longer. By using physical objects, we not only let users employ a larger expressive range of gestures and grasping behaviors but also leverage their innate spatial reasoning skills and everyday knowledge of object manipulation.

This thesis defines the concept of Graspable user interfaces, supports the concept with evidence from the psychological literature, and identifies instantiations of the concept in existing user interfaces. A task analysis of an existing interface's input activities, and of how these activities can be converted to Graspable user interface devices, is presented. The possible uses and implementation difficulties of Bricks, a specific Graspable user interface, are investigated. Finally, the advantages of two of the Graspable UI properties over conventional time-multiplexed, generic input devices are measured in two controlled experiments.