The Haptic Desktop: a novel 2D multimodal device

As computing becomes embedded in our everyday lives and moves from PCs to a wide variety of devices, we need new and flexible ways to interact with technology. Multimodal systems have recently been recognized as offering richer user experiences by combining different natural input modes (speech, touch, pen, hand gestures, etc.) in a coordinated manner with multimedia system outputs. As a new generation of multimodal systems begins to define itself, one dominant theme is the integration and synchronization required to combine different modes strategically into whole systems, where three basic modalities can be distinguished: visual, auditory, and tactile (following the physiology of the senses). To provide a broader range of signal integrations, currently limited by conventional computer input devices, these interfaces must interact in a manner akin to the way humans interact with real environments. This paper therefore describes the design and integration of a novel multimodal device developed at PERCRO. The device, named the Haptic Desktop, is designed to replace the conventional input devices used to access computer resources. Furthermore, to ensure the usability of the interface, so that operators can efficiently extract the principal features of each perceptual channel, the graphical visualization is integrated with the haptic system on a single desktop, satisfying the coherence and co-location design requirements.
