Multimodal feedback in HCI: haptics, non-speech audio, and their applications

Computer interfaces have traditionally depended on visual feedback to present information to users, with large, high-resolution screens the norm. Other sensory modalities, such as haptics and audio, have great potential to enrich the interaction between user and device and to enable new types of interaction for new user groups in new contexts. This chapter provides an overview of research on the use of these non-visual modalities for interaction, showing how new output modalities can be used in the user interface of different devices. The modalities discussed include: haptics: tactons (vibrotactile feedback), thermal (warming and cooling) feedback, force feedback, and deformable devices; non-speech audio: auditory icons, earcons, musicons, sonification, and spatial audio output.

One motivation for using multiple modalities in a user interface is that interaction can be distributed across the different senses and control capabilities of the person using it. If one modality is fully utilized or unavailable (e.g., due to a sensory or situational impairment), another can be exploited to ensure the interaction succeeds. For example, a user walking while using a mobile phone needs to focus visual attention on the environment to avoid bumping into other people; a complex visual interface on the phone makes this difficult, whereas haptic or audio feedback allows them to use the phone and navigate the world at the same time.

This chapter does not present background on multisensory perception and multimodal action; for insights on that topic, see Chapter 2. Chapter 3 also specifically discusses multisensory haptic interaction and the process of designing for it. As a complement, this chapter presents a range of applications where multimodal feedback involving haptics or non-speech audio can provide usability benefits, motivated by Wickens' Multiple Resources Theory [Wickens 2002]. The premise of this theory is that tasks can be performed better, and with fewer cognitive resources, when they are distributed across modalities. For example, when driving, which is a largely visual task, route guidance is better presented through sound than through a visual display, which would compete with driving for visual cognitive resources. Similarly, dialing or texting by hand while driving is harder than voice dialing, because manual input competes with the manual demands of driving while speech input does not. For user interface design, it is therefore important to distribute tasks across modalities so that the user is not overloaded and the interaction can succeed.
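
To make the modality encodings named above more concrete, the short Python sketch below illustrates how a tacton might encode message urgency in rhythm and intensity, and how a simple parameter-mapping sonification might map a data series onto pitch. This is a minimal sketch under assumed parameter values; the class, function names, and mappings are hypothetical and are not designs taken from the works cited here.

    from dataclasses import dataclass

    @dataclass
    class Tacton:
        """A structured vibrotactile message (hypothetical encoding)."""
        rhythm: list          # on/off durations in ms, e.g. [100, 50, 100]
        intensity: float      # 0.0 (off) to 1.0 (maximum amplitude)
        frequency_hz: int     # carrier frequency of the vibration actuator

    def urgency_tacton(level: str) -> Tacton:
        """Map an abstract urgency level to a tacton (illustrative mapping only)."""
        if level == "high":
            return Tacton(rhythm=[80, 40, 80, 40, 80], intensity=1.0, frequency_hz=250)
        if level == "medium":
            return Tacton(rhythm=[150, 100, 150], intensity=0.7, frequency_hz=200)
        return Tacton(rhythm=[300], intensity=0.4, frequency_hz=150)

    def sonify(values, low_note=48, high_note=84):
        """Map a data series onto MIDI pitches (a simple parameter-mapping sonification)."""
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1
        return [round(low_note + (v - lo) / span * (high_note - low_note)) for v in values]

    if __name__ == "__main__":
        print(urgency_tacton("high"))
        print(sonify([3.1, 4.7, 2.0, 5.5, 4.2]))   # rising/falling pitch contour

In practice, such parameters would be chosen on the basis of perceptual studies of tacton design and sonification mappings (e.g., [37, 69]) rather than picked ad hoc.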

[1]  Klaus Bengler,et al.  Eye Gaze Studies Comparing Head-Up and Head-Down Displays in Vehicles , 2007, 2007 IEEE International Conference on Multimedia and Expo.

[2]  Motoyuki Akamatsu,et al.  Movement characteristics using a mouse with tactile and force feedback , 1996, Int. J. Hum. Comput. Stud..

[3]  Frank E. Pollick,et al.  To Beep or Not to Beep?: Comparing Abstract versus Language-Based Multimodal Driver Displays , 2015, CHI.

[4]  S. Izadi,et al.  FlexCase: Enhancing Mobile Interaction with a Flexible Sensing and Display Cover , 2016, CHI.

[5]  Stephen A. Brewster,et al.  Overcoming the Lack of Screen Space on Mobile Computers , 2002, Personal and Ubiquitous Computing.

[6]  Vaibhava Goel,et al.  Audio and visual modality combination in speech processing applications , 2017, The Handbook of Multimodal-Multisensor Interfaces, Volume 1.

[7]  Dario D. Salvucci Predicting the effects of in-car interface use on driver performance: an integrated model approach , 2001, Int. J. Hum. Comput. Stud..

[8]  Jan O. Borchers,et al.  FingerFlux: near-surface haptic feedback on tabletops , 2011, UIST.

[9]  P. Sines,et al.  Peltier Haptic Interface (PHI) for improved sensation of touch in virtual environments , 2005, Virtual Reality.

[10]  Roderick Murray-Smith,et al.  Shoogle: excitatory multimodal interaction on mobile devices , 2007, CHI.

[11]  Tamotsu Murakami,et al.  Direct and intuitive input device for 3-D shape deformation , 1994, CHI '94.

[12]  Petros Maragos,et al.  Multimodal gesture recognition , 2017, The Handbook of Multimodal-Multisensor Interfaces, Volume 1.

[13]  Hiroyuki Shinoda,et al.  Non-contact Method for Producing Tactile Sensation Using Airborne Ultrasound , 2008, EuroHaptics.

[14]  Gregory Kramer,et al.  Auditory Display: Sonification, Audification, And Auditory Interfaces , 1994 .

[15]  R. Klatzky,et al.  Hand movements: A window into haptic object recognition , 1987, Cognitive Psychology.

[16]  Hiroshi Ishii,et al.  Direct and gestural interaction with relief: a 2.5D shape display , 2011, UIST '11.

[17]  Helge J. Ritter,et al.  Sound and meaning in auditory data display , 2004, Proceedings of the IEEE.

[18]  Bruce N. Walker,et al.  Listener, Task, and Auditory Graph: Toward a Conceptual Model of Auditory Graph Comprehension , 2007 .

[19]  Hiroshi Ishii,et al.  inFORM: dynamic physical affordances and constraints through shape and object actuation , 2013, UIST.

[20]  Patrick Baudisch,et al.  Skin Drag Displays: Dragging a Physical Tactor across the User's Skin Produces a Stronger Tactile Stimulus than Vibrotactile , 2015, CHI.

[21]  A. A. Collins,et al.  Vibrotactile localization on the arm: Effects of place, space, and age , 2003, Perception & psychophysics.

[22]  Hiroshi Ishii,et al.  Radical atoms: beyond tangible bits, toward transformable materials , 2012, INTR.

[23]  Stephen A. Brewster,et al.  Designing audio and tactile crossmodal icons for mobile devices , 2007, ICMI '07.

[24]  Stephen A. Brewster,et al.  New parameters for tacton design , 2007, CHI Extended Abstracts.

[25]  Martin Halvey,et al.  Some like it hot: thermal feedback for mobile devices , 2011, CHI.

[26]  Frank E. Pollick,et al.  Language-based multimodal displays for the handover of control in autonomous cars , 2015, AutomotiveUI.

[27]  Beryl Plimmer,et al.  Multimodal collaborative handwriting training for visually-impaired people , 2008, CHI.

[28]  Jeremy R. Cooperstock,et al.  What's around Me? Spatialized Audio Augmented Reality for Blind Users with a Smartphone , 2011, MobiQuitous.

[29]  Stephen A. Brewster,et al.  Comparing two haptic interfaces for multimodal graph rendering , 2002, Proceedings 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. HAPTICS 2002.

[30]  M. Sile O'Modhrain,et al.  GpsTunes: controlling navigation via audio feedback , 2005, Mobile HCI.

[31]  Christopher Miller,et al.  Olfoto: designing a smell-based interaction , 2006, CHI.

[32]  Stephen A. Brewster,et al.  In the Heat of the Moment: Subjective Interpretations of Thermal Feedback During Interaction , 2015, CHI.

[33]  Tapio Lokki,et al.  Eyes-free interaction with free-hand gestures and auditory menus , 2012, Int. J. Hum. Comput. Stud..

[34]  Sandra Pauletto,et al.  A comparison of audio & visual analysis of complex time-series data sets , 2005 .

[35]  Stephen A. Brewster,et al.  Setting the Standards for Haptic and Tactile Interactions: ISO's Work , 2010, EuroHaptics.

[36]  Carryl L. Baldwin,et al.  Multimodal urgency coding: auditory, visual, and tactile parameters and their impact on perceived urgency. , 2012, Work.

[37]  Lorna M. Brown,et al.  A first investigation into the effectiveness of Tactons , 2005, First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics Conference.

[38]  Alexander Keith Eady,et al.  One-Handed Bend Interactions with Deformable Smartphones , 2015, CHI.

[39]  Bruce N. Walker,et al.  Mappings and metaphors in auditory displays: An experimental assessment , 2005, TAP.

[40]  Stephen Brewster,et al.  A Detailed Investigation into the Effectiveness of Earcons , 1997 .

[41]  Shwetak N. Patel,et al.  SideSwipe: detecting in-air gestures around mobile devices using actual GSM signal , 2014, UIST.

[42]  Sriram Subramanian,et al.  Tilt displays: designing display surfaces with multi-axis tilting and actuation , 2012, Mobile HCI.

[43]  Sile O'Modhrain,et al.  The bird's ear view of space physics: Audification as a tool for the spectral analysis of time series data , 2014 .

[44]  Lorna M. Brown,et al.  Tactile feedback for mobile interactions , 2007, CHI.

[45]  Stephen Barrass,et al.  Using sonification , 1999, Multimedia Systems.

[46]  Stephen A. Brewster,et al.  Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions , 2014, ICMI.

[47]  Niels Henze,et al.  Tactile wayfinder: a non-visual support system for wayfinding , 2008, NordiCHI.

[48]  Stephen A. Brewster,et al.  Crossmodal icons for information display , 2006, CHI EA '06.

[49]  Antti Jylhä,et al.  A Wearable Multimodal Interface for Exploring Urban Points of Interest , 2015, ICMI.

[50]  Stephen Brewster,et al.  The sound of musicons: investigating the design of musically derived audio cues , 2012 .

[51]  William W. Gaver The SonicFinder: An Interface That Uses Auditory Icons , 1989, Hum. Comput. Interact..

[52]  David R. Morse,et al.  AudioGPS: Spatial Audio Navigation with a Minimal Attention Interface , 2002, Personal and Ubiquitous Computing.

[53]  Alexander Toet,et al.  Uni-, bi- and tri-modal warning signals: effects of temporal parameters and sensory modality on perceived urgency , 2015 .

[54]  Andrew Sears,et al.  An empirical comparison of use-in-motion evaluation scenarios for mobile computing devices , 2005, Int. J. Hum. Comput. Stud..

[55]  Stephen A. Brewster,et al.  Do That, There: An Interaction Technique for Addressing In-Air Gesture Systems , 2016, CHI.

[56]  Roberto Manduchi,et al.  Vibrotactile Guidance for Wayfinding of Blind Walkers , 2015, IEEE Transactions on Haptics.

[57]  Stephen Brewster,et al.  Web-based Multimodal Graphs for Visually Impaired People , 2002 .

[58]  Hannes Bleuler,et al.  Encoded and Crossmodal Thermal Stimulation through a Fingertip-Sized Haptic Display , 2015, Front. Robot. AI.

[59]  Jun Rekimoto,et al.  Ambient touch: designing tactile interfaces for handheld devices , 2002, UIST '02.

[60]  Stephen A. Brewster,et al.  Towards In-Air Gesture Control of Household Appliances with Limited Displays , 2015, INTERACT.

[61]  Hiroshi Ishii,et al.  Jamming user interfaces: programmable particle stiffness and sensing for malleable and shape-changing devices , 2012, UIST.

[62]  H Summala,et al.  Driving experience and perception of the lead car's braking when looking at in-car targets. , 1998, Accident; analysis and prevention.

[63]  Philip R. Cohen,et al.  Multimodal speech and pen interfaces , 2017, The Handbook of Multimodal-Multisensor Interfaces, Volume 1.

[64]  Hong Z. Tan,et al.  Efficient Multimodal Cuing of Spatial Attention , 2013, Proceedings of the IEEE.

[65]  Jennifer Pearson,et al.  Emergeables: Deformable Displays for Continuous Eyes-Free Mobile Interaction , 2016, CHI.

[66]  Alexandra Neukum,et al.  The effect of urgency of take-over requests during highly automated driving under distraction conditions , 2014 .

[67]  Sriram Subramanian,et al.  UltraHaptics: multi-point mid-air haptic feedback for touch surfaces , 2013, UIST.

[68]  Lorna M. Brown,et al.  Tactons: Structured Tactile Messages for Non-Visual Information Display , 2004, AUIC.

[69]  Bruce N. Walker,et al.  Psychophysical scaling of sonification mappings , 2000 .

[70]  Karon E. MacLean,et al.  Haptic Interaction Design for Everyday Interfaces , 2008 .

[71]  Beryl Plimmer,et al.  Signing on the tactile line: A multimodal system for teaching handwriting to blind children , 2011, TCHI.

[72]  Ian Oakley,et al.  Putting the feel in 'look and feel' , 2000, CHI.

[73]  Roberta L. Klatzky,et al.  Nonvisual Route following with Guidance from a Simple Haptic or Auditory Display , 2007 .

[74]  Meredith Ringel Morris,et al.  Touchplates: low-cost tactile overlays for visually impaired touch screen users , 2013, ASSETS.

[75]  Christopher D. Wickens,et al.  Multiple resources and performance prediction , 2002 .

[76]  David McGookin,et al.  Graph Builder: Constructing Non-visual Visualizations , 2007 .

[77]  H Alm,et al.  Changes in driver behaviour as a function of handsfree mobile phones--a simulator study. , 1994, Accident; analysis and prevention.

[78]  Gregory Kramer,et al.  Sonification and the interaction of perceptual dimensions: Can the data get lost in the map? , 2000 .

[79]  Markus Löchtefeld,et al.  Morphees: toward high "shape resolution" in self-actuated flexible mobile devices , 2013, CHI.

[80]  Frank E. Pollick,et al.  Evaluating multimodal driver displays of varying urgency , 2013, AutomotiveUI.

[81]  Stephen A. Brewster,et al.  Transient and transitional states: pressure as an auxiliary input modality for bimanual interaction , 2014, CHI.

[82]  Martin Halvey,et al.  Thermal icons: evaluating structured thermal feedback for mobile interaction , 2012, Mobile HCI.

[83]  Gerald L. Lohse Models of Graphical Perception , 1997 .

[84]  R. Cholewiak,et al.  Vibrotactile localization on the abdomen: Effects of place and space , 2004, Perception & psychophysics.

[85]  James D. Hollan,et al.  Tapping and rubbing: exploring new dimensions of tactile feedback with voice coil motors , 2008, UIST '08.

[86]  M. Tscheligi,et al.  Experiencing Autonomous Vehicles: Crossing the Boundaries between a Drive and a Ride , 2015, CHI Extended Abstracts.

[87]  Frank E. Pollick,et al.  Evaluating multimodal driver displays under varying situational urgency , 2014, CHI.

[88]  Daniel Thalmann,et al.  A wearable system for mobility improvement of visually impaired people , 2007, The Visual Computer.

[89]  Stephen A. Brewster,et al.  Understanding concurrent earcons: Applying auditory scene analysis principles to concurrent earcon recognition , 2004, TAP.

[90]  Martin Halvey,et al.  Thermal Feedback Identification in a Mobile Environment , 2013, HAID.

[91]  Maria Klara Wolters,et al.  Name that tune: musicons as reminders in the home , 2011, CHI.

[92]  Thomas H. Massie,et al.  The PHANToM Haptic Interface: A Device for Probing Virtual Objects , 1994 .

[93]  Ivan Poupyrev,et al.  Gummi: a bendable computer , 2004, CHI '04.

[94]  Charles M. Higgins,et al.  A Navigation Aid for the Blind Using Tactile-Visual Sensory Substitution , 2006, 2006 International Conference of the IEEE Engineering in Medicine and Biology Society.

[95]  Jan Stage,et al.  New techniques for usability evaluation of mobile systems , 2004, Int. J. Hum. Comput. Stud..

[96]  Charles Spence,et al.  The Multisensory Driver: Implications for Ergonomic Car Interface Design , 2012 .

[97]  Lorna M. Brown,et al.  Multidimensional tactons for non-visual information presentation in mobile devices , 2006, Mobile HCI.

[98]  Valtteri Wikström,et al.  SoundFLEX: Designing Audio to Guide Interactions with Shape-Retaining Deformable Interfaces , 2014, ICMI.

[99]  K E Barner,et al.  Design of a haptic data visualization system for people with visual impairments. , 1999, IEEE transactions on rehabilitation engineering : a publication of the IEEE Engineering in Medicine and Biology Society.

[100]  Stephen A. Brewster,et al.  Mapping information to audio and tactile icons , 2009, ICMI-MLMI '09.

[101]  J.B.F. van Erp,et al.  Vibro-Tactile Information Presentation in Automobiles , 2001 .

[102]  William W. Gaver Auditory Icons: Using Sound in Computer Interfaces , 1986, Hum. Comput. Interact..

[103]  Hendrik A. H. C. van Veen,et al.  Waypoint navigation with a vibrotactile waist belt , 2005, TAP.

[104]  Stephen A. Brewster,et al.  Investigating the effectiveness of tactile feedback for mobile touchscreens , 2008, CHI.

[105]  Stephen A. Brewster,et al.  Using nonspeech sounds to provide navigation cues , 1998, TCHI.

[106]  Motoyuki Akamatsu,et al.  A multi-modal mouse with tactile and force feedback , 1994, Int. J. Hum. Comput. Stud..

[107]  Meera Blattner,et al.  Earcons and Icons: Their Structure and Common Design Principles , 1989, Hum. Comput. Interact..

[108]  Stephen A. Brewster,et al.  Web-based haptic applications for blind people to create virtual graphs , 2003, 11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2003. HAPTICS 2003. Proceedings..

[109]  Riender Happee,et al.  Public opinion on automated driving: results of an international questionnaire among 5000 respondents , 2015 .

[110]  Stephen A. Brewster,et al.  The Impact of Encumbrance on Mobile Interactions , 2013, INTERACT.

[111]  Roel Vertegaal,et al.  PaperPhone: understanding the use of bend gestures in mobile devices with flexible electronic paper displays , 2011, CHI.

[112]  Cagatay Basdogan,et al.  Haptics in virtual environments: taxonomy, research status, and challenges , 1997, Comput. Graph..

[113]  Seungmoon Choi,et al.  Vibrotactile Display: Perception, Technology, and Applications , 2013, Proceedings of the IEEE.

[114]  Hong Z. Tan,et al.  Using spatial vibrotactile cues to direct visual attention in driving scenes , 2005 .

[115]  Stephen A. Brewster,et al.  Clutching at straws: using tangible interaction to provide non-visual access to graphs , 2010, CHI.

[116]  Stephen A. Brewster,et al.  Investigating the effects of encumbrance on one- and two- handed interactions with mobile devices , 2014, CHI.

[117]  Hiroshi Ishii,et al.  PneUI: pneumatically actuated soft composite materials for shape changing interfaces , 2013, UIST.

[118]  Stephen A. Brewster,et al.  Multimodal Trajectory Playback for Teaching Shape Information and Trajectories to Visually Impaired Computer Users , 2008, TACC.

[119]  C. Spence,et al.  Assessing the effectiveness of various auditory cues in capturing a driver's visual attention. , 2005, Journal of experimental psychology. Applied.

[120]  Helge Ritter,et al.  Listen to your Data: Model-Based Sonification for Data Analysis , 1999 .