What You See Is What You Feel: On the simulation of touch in graphical user interfaces

This study introduces a novel method of simulating touch by purely visual means. Interactive animations are used to create an optical illusion that evokes haptic percepts such as stickiness, stiffness and mass within a standard graphical user interface. The technique, called optically simulated haptic feedback, exploits the dominance of the visual over the haptic modality and the general human tendency to integrate information from the various senses.

The study began with an aspiration to increase the sensorial qualities of the graphical user interface. With the introduction of the graphical user interface – and in particular the desktop metaphor – computers have become accessible to almost anyone; all over the world, people from various cultures use the same icons, folders, buttons and trashcans. From a sensorial point of view, however, this computing paradigm is still extremely limited. Touch can play a powerful role in communication: it can offer an immediacy and intimacy unparalleled by words or images. Although few doubt the intrinsic value of touch perception in everyday life, examples of modern technology in which human–machine communication uses the tactile and kinesthetic senses as additional channels of information are scarce. Hence, it has often been suggested that improvements in the sensorial qualities of computers could lead to more natural interfaces.

Various researchers have created scenarios and technologies intended to enrich the sensorial qualities of our digital environment. Some have developed mechanical force feedback devices that enable people to experience haptics while interacting with a digital display. Others have suggested that the computer should ‘disappear’ into the environment and have proposed tangible objects as a means to connect the digital and the physical world. While the scenarios of force feedback, tangible interaction and the disappearing computer are maturing, millions of people still work with a desktop computer interface every day. In spite of its obvious drawbacks, the desktop computing model has penetrated deeply into our society and cannot be expected to disappear overnight. Radically different computing paradigms will require the development of radically different hardware; this takes time, and it remains uncertain whether, and when, other computing paradigms will replace the current desktop setup.

For that reason we pursued another approach towards physical computing. Inspired by Renaissance painters, who centuries ago invented illusionistic techniques such as perspective and trompe l'oeil to increase the presence of their paintings, we aim to improve the physicality of the graphical user interface without resorting to special hardware. Optically simulated haptic feedback, described in this thesis, has much in common with mechanical force-feedback systems, except that in mechanical systems the location of the cursor is manipulated as a result of the force sent to the haptic device (force-feedback mouse, trackball, etc.), whereas in our system the cursor location is manipulated directly, resulting in purely visual force feedback. By applying tiny displacements to the cursor's movement, tactile sensations such as stickiness, stiffness or mass can be simulated. In chapter 2 we suggest that this active cursor technique can be applied to create richer interactions without the need for special hardware: the cursor channel is transformed from an input-only channel into an input/output channel. The active cursor displacements can be used to create various (dynamic) slopes as well as textures and material properties, which can provide the user with feedback while navigating the on-screen environment.
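To make the active cursor idea concrete, the following is a minimal sketch in TypeScript rather than the thesis's Flash implementation; the Gaussian "hole" profile, the function names and the gain parameter are illustrative assumptions, not the actual PowerCursor code. The cursor is nudged "downhill" along the gradient of a simulated height field on every mouse move, which the eye reads as a haptic dip.

```typescript
// Minimal sketch (illustrative assumptions, not the thesis's implementation):
// an "active cursor" displacement derived from a simulated height field.

interface Point { x: number; y: number; }

// Height of a hypothetical Gaussian "hole" at position p.
function holeHeight(p: Point, centre: Point, radius: number, depth: number): number {
  const dx = p.x - centre.x;
  const dy = p.y - centre.y;
  return -depth * Math.exp(-(dx * dx + dy * dy) / (2 * radius * radius));
}

// Displacement = a small step along the negative gradient of the height
// field, i.e. "downhill"; gain controls the simulated force strength.
function activeCursorOffset(p: Point, centre: Point, radius: number,
                            depth: number, gain: number): Point {
  const eps = 1; // pixel step for a central-difference gradient estimate
  const dHdx = (holeHeight({ x: p.x + eps, y: p.y }, centre, radius, depth)
              - holeHeight({ x: p.x - eps, y: p.y }, centre, radius, depth)) / (2 * eps);
  const dHdy = (holeHeight({ x: p.x, y: p.y + eps }, centre, radius, depth)
              - holeHeight({ x: p.x, y: p.y - eps }, centre, radius, depth)) / (2 * eps);
  return { x: -gain * dHdx, y: -gain * dHdy };
}

// Example: a cursor at (105, 98) near a hole centred at (100, 100)
// receives a small pull towards the hole's centre.
console.log(activeCursorOffset({ x: 105, y: 98 }, { x: 100, y: 100 }, 40, 20, 0.5));
```

Since the operating-system pointer cannot be repositioned from script in Flash or a browser, an implementation of this kind would typically hide the real pointer and draw a cursor sprite at the displaced position.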
In chapter 3 the perceptual illusion of touch that results from the dominance of the visual over the haptic modality is placed in the larger context of prior research and tested experimentally. Using both the active cursor technique and a mechanical force feedback device, we generated bump and hole structures. In a controlled experiment the perception of the slopes was measured, comparing the optical with the mechanical simulation. The results show that people can recognize optically simulated bump and hole structures, and that active cursor displacements influence the haptic perception of bumps and holes. Depending on the simulated strength of the force, optically simulated haptic feedback can take precedence over mechanically simulated haptic feedback, and vice versa. When optically and mechanically simulated haptic feedback counteract each other, however, the weight attributed to each source of haptic information differs between users. We conclude that active cursor displacements can be used to optically simulate the operation of mechanical force feedback devices.

An obvious application of optically simulated haptic feedback in graphical user interfaces is to assist the user in pointing at icons and objects on the screen. Given the pervasiveness of pointing in graphical interfaces, every small improvement in a target-acquisition task represents a substantial improvement in usability. Can active cursor displacements be applied to help users reach their goal? In chapter 4 we test the usability of optically simulated haptic feedback in a pointing task, again in comparison with the force feedback generated by a mechanical device. In a controlled Fitts'-law-type experiment, subjects were asked to point and click at targets of different sizes and distances. The results show that rendering hole-type structures underneath the targets improves the effectiveness, efficiency and satisfaction of the target-acquisition task. Optically simulated haptic feedback results in lower error rates, more satisfaction and a higher index of performance, which can be attributed to the shorter movement times realized for the smaller targets. For larger targets, optically simulated haptic feedback resulted in movement times comparable to those of mechanically simulated haptic feedback.

Since current graphical interfaces are not designed with tactility in mind, the development of novel interaction styles is an important research path as well. Before optically simulated haptic feedback can be fully brought into play in more complex interaction styles, designers and researchers need to experiment further with the technique. In chapter 5 we describe a software prototyping toolkit, called PowerCursor, which enables designers to create interaction styles using optically simulated haptic feedback without elaborate programming. The software engine consists of a set of ready-made force field objects – holes, hills, ramps, rough and slick objects, walls, whirls, and more – that can be added to any Flash project, as well as force behaviours that can be added to custom-made shapes and objects. These basic building blocks can be combined to create more complex and dynamic force objects.
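The composition idea can be illustrated with a small, hedged sketch (again in TypeScript rather than the toolkit's actual ActionScript/Flash code; the field definitions, names and parameters are assumptions made for illustration): each force-field primitive maps the cursor position to a small offset, and a compound force object simply sums the offsets of its parts.

```typescript
// Sketch of composing force-field building blocks (illustrative only;
// the real PowerCursor engine is an ActionScript/Flash library).

interface Point { x: number; y: number; }

// A force field maps the current cursor position to a small offset.
type ForceField = (p: Point) => Point;

// Hypothetical hole: a linear pull towards its centre, fading out at the rim.
function hole(centre: Point, radius: number, strength: number): ForceField {
  return (p) => {
    const dx = centre.x - p.x;
    const dy = centre.y - p.y;
    const d = Math.hypot(dx, dy);
    if (d === 0 || d > radius) return { x: 0, y: 0 };
    const pull = strength * (1 - d / radius); // stronger near the centre
    return { x: (dx / d) * pull, y: (dy / d) * pull };
  };
}

// Hypothetical rough texture: small deterministic jitter that reads as friction.
function roughTexture(amplitude: number): ForceField {
  return (p) => {
    const n = Math.sin(p.x * 12.9898 + p.y * 78.233) * 43758.5453;
    const jitter = (n - Math.floor(n)) - 0.5; // value in [-0.5, 0.5)
    return { x: jitter * amplitude, y: -jitter * amplitude };
  };
}

// A compound force object is the vector sum of its parts.
function combine(...fields: ForceField[]): ForceField {
  return (p) =>
    fields.reduce(
      (acc, f) => {
        const o = f(p);
        return { x: acc.x + o.x, y: acc.y + o.y };
      },
      { x: 0, y: 0 }
    );
}

// Usage: a sticky, rough "button" region built from two primitives.
const stickyRoughButton = combine(hole({ x: 200, y: 150 }, 60, 3), roughTexture(1));
console.log(stickyRoughButton({ x: 220, y: 160 })); // offset applied to the drawn cursor
```

The real PowerCursor engine may well organize its force objects differently; the sketch only shows how displacement-based building blocks lend themselves naturally to simple vector addition.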
This setup should allow the users of the toolkit to creatively design their own interaction styles with optically simulated haptic feedback. The toolkit is implemented in Adobe Flash and can be downloaded at www.powercursor.com. Furthermore, in chapter 5 we present a preliminary framework of the expected applicability of optically simulated haptic feedback. Illustrated with examples created with the beta version of the PowerCursor toolkit, we discuss ideas for novel interaction styles. Besides assisting the user while navigating, optically simulated haptic feedback might be applied to create so-called mixed-initiative interfaces; one can, for instance, think of an installation wizard that guides the cursor towards the recommended next step. Furthermore, since optically simulated haptic feedback can communicate the material properties of textures or 3D objects, it can be applied to create aesthetically pleasing interactions, which are becoming more relevant as computers migrate into domains beyond the office environment. Finally, we discuss opportunities for applications outside the desktop computer model; in principle, optically simulated haptic feedback can play a role in any graphical interface in which the input and output channels are decoupled.

In chapter 6 we draw conclusions and discuss future directions. We conclude that optically simulated haptic feedback can increase the physicality and quality of current graphical user interfaces without resorting to specialized hardware. Users are able to recognize haptic structures simulated by applying active cursor displacements to their mouse movements. Our technique of simulating haptic feedback optically opens up an additional communication channel with the user that can enhance the usability of the graphical interface. However, the active cursor technique cannot be expected to replace mechanical haptic feedback altogether, since it works only in combination with a visual display and therefore will not work for visually impaired users. Rather, we expect that the ability to employ tactile interaction styles in a standard graphical user interface could catalyze the development of novel physical interaction styles and, in the long term, might foster the acceptance of haptic devices.

With this research we hope to have contributed to a more sensorial and richer graphical user interface. Moreover, we have aimed to increase awareness and understanding of media technology and simulations in general. Our scientific results are therefore deliberately presented within a socio-cultural context that reflects on the dominance of the visual modality in our society and the ever-increasing role of media and simulations in people's everyday lives.
