When Content Matters: The Role of Processing Code in Tactile Display Design

The distribution of tasks and stimuli across multiple modalities has been proposed as a means of supporting multitasking in data-rich environments. Recently, the tactile channel and, more specifically, communication via tactile/haptic icons have received considerable interest. Past research has primarily examined the impact of concurrent task modality on the effectiveness of tactile information presentation. However, little is known about the extent to which the interpretation of iconic tactile patterns is affected by another attribute of information: the information processing codes of concurrent tasks. In two driving simulation studies (n = 25 each), participants decoded icons composed of either spatial or nonspatial patterns of vibrations (engaging spatial and nonspatial processing-code resources, respectively) while concurrently interpreting spatial or nonspatial visual task stimuli. As predicted by Multiple Resource Theory, performance was significantly worse (by approximately 5-10 percent) when the tactile icons and visual tasks engaged the same processing code, with the worst overall performance in the spatial-spatial task pairing. The findings from these studies contribute to an improved understanding of information processing and can serve as input to multidimensional quantitative models of timesharing performance. From an applied perspective, the results suggest that competition for processing-code resources warrants consideration, alongside other factors such as the naturalness of signal-message mapping, when designing iconic tactile displays. Nonspatially encoded tactile icons may be preferable in environments that already rely heavily on spatial processing, such as car cockpits.
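To illustrate how processing-code competition could feed into a multidimensional quantitative model of timesharing performance, the sketch below scores resource overlap between two concurrent tasks in the spirit of a multiple-resource conflict calculation. This is a minimal illustration, not the authors' model or Wickens' published algorithm: the Task fields, the dimension names, and the simple "count shared levels" conflict rule are assumptions made for this example.

```python
# Minimal sketch of a multiple-resource style conflict score.
# Assumption: each resource dimension two concurrent tasks share
# (here: modality and processing code) adds one unit of predicted conflict.

from dataclasses import dataclass


@dataclass(frozen=True)
class Task:
    """A task described on the two resource dimensions used in this sketch."""
    modality: str          # e.g. "visual" or "tactile"
    processing_code: str   # e.g. "spatial" or "nonspatial"


def conflict_score(a: Task, b: Task) -> int:
    """Count the resource dimensions on which two concurrent tasks overlap.

    Higher scores predict poorer timesharing in this simplified scheme.
    """
    score = 0
    if a.modality == b.modality:
        score += 1
    if a.processing_code == b.processing_code:
        score += 1
    return score


if __name__ == "__main__":
    visual_spatial_task = Task("visual", "spatial")
    for icon_code in ("spatial", "nonspatial"):
        tactile_icon = Task("tactile", icon_code)
        print(f"{icon_code} tactile icon vs. spatial visual task: "
              f"conflict = {conflict_score(tactile_icon, visual_spatial_task)}")
    # The spatial tactile icon shares a processing code with the spatial
    # visual task (conflict 1), while the nonspatial icon shares neither
    # dimension (conflict 0), mirroring the predicted spatial-spatial
    # disadvantage reported in the abstract.
```

Under this toy scoring, the spatial-spatial pairing is the only one that accrues processing-code conflict, which is consistent with the design recommendation to favor nonspatially encoded tactile icons in spatially demanding environments.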
