Audio Focus: Interactive spatial sound coupled with haptics to improve sound source location in poor visibility

Abstract In an effort to simplify human resource management and reduce costs, control towers are increasingly designed to operate not on the airport itself but remotely. This concept, known as the Remote Control Tower, offers a "digital" working context in which the view of the runways is broadcast remotely via cameras located at the physical airport. This gives researchers and engineers the opportunity to develop novel interaction techniques. However, the technology relies heavily on the sense of sight, which already carries most of the operator's information and interaction and is becoming overloaded. In this paper, we focus on the design and evaluation of new forms of interaction that rely on the human senses of hearing and touch. More precisely, our study aims to quantify the contribution of a multimodal interaction technique, based on spatial sound and vibrotactile feedback, to improving aircraft localization. Applied to the Remote Tower environment, the ultimate goal is to enhance Air Traffic Controllers' perception and increase safety. Three interaction modalities were compared in a simulated environment with 22 Air Traffic Controllers. The experimental task consisted in locating aircraft at different airspace positions using the senses of hearing and touch, under two visibility conditions. In the first modality (spatial sound only), all sound sources (i.e. aircraft) had the same amplification factor. In the second modality (called Audio Focus), the amplification factor of the sound sources located along the sagittal axis of the participant's head was increased, while the intensity of the sources located outside this axis was decreased. In the third modality, Audio Focus was coupled with vibrotactile feedback that additionally indicated the vertical positions of aircraft. Behavioral results (accuracy and response-time measurements) and subjective results (questionnaires) showed significantly higher performance in poor visibility when using the Audio Focus interaction. In particular, interactive spatial sound gave participants notably higher localization accuracy in degraded visibility than spatial sound alone, and accuracy improved further when Audio Focus was coupled with vibrotactile feedback. Response times, however, were significantly longer with the Audio Focus modality (with or without vibrotactile feedback), while remaining acceptably short. This study can be seen as a first step in the development of a novel interaction technique that uses sound as a means of localization when the sense of sight alone is not enough.
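As described above, Audio Focus modulates each sound source's gain according to its angular distance from the head's sagittal axis: on-axis sources are amplified, off-axis sources attenuated. A minimal Python sketch of one plausible gain law is shown below; the Gaussian falloff, beam width, and gain range are illustrative assumptions, not values taken from the study.

```python
import math

def audio_focus_gain(source_azimuth_deg: float, head_azimuth_deg: float,
                     beam_width_deg: float = 30.0,
                     min_gain: float = 0.2, max_gain: float = 2.0) -> float:
    """Gain applied to one sound source under an Audio Focus-style law.

    Sources aligned with the head's sagittal axis are boosted (up to
    max_gain); sources far off-axis are attenuated (down to min_gain).
    A Gaussian falloff over angular distance is one plausible choice;
    the exact curve used in the study is an assumption here.
    """
    # Smallest signed angular difference, folded into [-180, 180)
    diff = (source_azimuth_deg - head_azimuth_deg + 180.0) % 360.0 - 180.0
    falloff = math.exp(-(diff / beam_width_deg) ** 2)  # 1 on-axis, -> 0 off-axis
    return min_gain + (max_gain - min_gain) * falloff

# An aircraft on the listener's sagittal axis is boosted to max_gain,
# while one 90 degrees off-axis is attenuated toward min_gain.
on_axis = audio_focus_gain(0.0, 0.0)
off_axis = audio_focus_gain(90.0, 0.0)
```

In a head-tracked simulator, `head_azimuth_deg` would come from the head tracker each frame, so turning toward an aircraft smoothly brings it into acoustic focus while the rest of the scene recedes.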

[1]  Christophe Hurter,et al.  Strip'TIC: exploring augmented paper strips for air traffic controllers , 2012, AVI.

[2]  Anne Papenfuss,et al.  Head up only — A design concept to enable multiple remote tower operations , 2016, 2016 IEEE/AIAA 35th Digital Avionics Systems Conference (DASC).

[3]  Richard A. Bolt,et al.  Gaze-orchestrated dynamic windows , 1981, SIGGRAPH '81.

[4]  Jeffrey S. Shell,et al.  AuraMirror: artistically visualizing attention , 2003, CHI Extended Abstracts.

[5]  Gregory H. Wakefield,et al.  Introduction to Head-Related Transfer Functions (HRTFs): Representations of HRTFs in Time, Frequency, and Space , 2001 .

[6]  Matthew J. Jensen,et al.  A Customizable Automotive Steering System With a Haptic Feedback Control Strategy for Obstacle Avoidance Notification , 2011, IEEE Transactions on Vehicular Technology.

[7]  Fabio Babiloni,et al.  Human-Machine Interaction Assessment by Neurophysiological Measures: A Study on Professional Air Traffic Controllers , 2018, 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC).

[8]  Rob Gray,et al.  A Comparison of Different Informative Vibrotactile Forward Collision Warnings: Does the Warning Need to Be Linked to the Collision Event? , 2014, PloS one.

[9]  Vilas Nene Remote Tower Research in the United States , 2016 .

[10]  James D. Hollan,et al.  Direct Manipulation Interfaces , 1985, Hum. Comput. Interact..

[11]  Alois Sontacchi,et al.  Application of a transaural focused sound reproduction , 2009 .

[12]  Hendrik A. H. C. van Veen,et al.  Waypoint navigation with a vibrotactile waist belt , 2005, TAP.

[13]  Sharon L. Oviatt,et al.  Mutual disambiguation of recognition errors in a multimodel architecture , 1999, CHI '99.

[14]  Norbert Fürstenau,et al.  Steps towards the Virtual Tower: Remote Airport Traffic Control Center (RAiCe) , 2009 .

[15]  Christian P. Robert,et al.  Monte Carlo Statistical Methods , 2005, Springer Texts in Statistics.

[16]  Alexander H. Waibel,et al.  Estimating focus of attention based on gaze and sound , 2001, PUI '01.

[17]  F L Wightman,et al.  Localization using nonindividualized head-related transfer functions. , 1993, The Journal of the Acoustical Society of America.

[18]  Sriram Subramanian,et al.  UltraHaptics: multi-point mid-air haptic feedback for touch surfaces , 2013, UIST.

[19]  A. Mills On the minimum audible angle , 1958 .

[20]  Hoon Kim,et al.  Monte Carlo Statistical Methods , 2000, Technometrics.

[21]  Svein Braathen,et al.  Air Transport Services in Remote Regions , 2011 .

[22]  J C F de Winter,et al.  Comparing spatially static and dynamic vibrotactile take-over requests in the driver seat. , 2017, Accident; analysis and prevention.

[23]  Nadine B. Sarter,et al.  Good Vibrations: Tactile Feedback in Support of Attention Allocation and Human-Automation Coordination in Event-Driven Domains , 1999, Hum. Factors.

[24]  Thomas H. Massie,et al.  The PHANToM Haptic Interface: A Device for Probing Virtual Objects , 1994 .

[25]  Frederick P. Brooks,et al.  Project GROPEHaptic displays for scientific visualization , 1990, SIGGRAPH.

[26]  Davide Rocchesso,et al.  The Sonification Handbook , 2011 .

[27]  F. J. Van Schaik,et al.  Assessment of visual cues by tower controllers, with implications for a remote tower control centre , 2010, IFAC HMS.

[28]  Fanxing Meng,et al.  Dynamic Vibrotactile Signals for Forward Collision Avoidance Warning Systems , 2015, Hum. Factors.

[29]  J.B.F. van Erp,et al.  Vibrotactile waypoint navigation at sea and in the air : two case studies , 2004 .

[30]  Sandra G. Hart,et al.  Nasa-Task Load Index (NASA-TLX); 20 Years Later , 2006 .

[31]  Federico Fontana,et al.  An exploration on whole-body and foot-based vibrotactile sensitivity to melodic consonance , 2016 .

[32]  Christophe Hurter,et al.  Immersive solutions for future Air Traffic Control and Management , 2016, ISS Companion.

[33]  Clara Suied,et al.  The Spatial Release of Cognitive Load in Cocktail Party Is Determined by the Relative Levels of the Talkers , 2017, Journal of the Association for Research in Otolaryngology.

[34]  Philip R. Cohen,et al.  Tangible multimodal interfaces for safety-critical applications , 2004, CACM.

[35]  Anil K. Raj,et al.  Vibrotactile Displays for Improving Spatial Awareness , 2000 .

[36]  R. Bowen Loftin,et al.  Multisensory perception: beyond the visual in visualization , 2003, Comput. Sci. Eng..

[37]  C. Mélan,et al.  Recall Performance in Air Traffic Controllers Across the 24-hr Day: Influence of Alertness and Task Demands on Recall Strategies , 2012 .

[38]  Roel Vertegaal,et al.  OverHear: augmenting attention in remote social gatherings through computer-mediated hearing , 2005, CHI EA '05.

[39]  Charles Lenay,et al.  FeelTact: rich tactile feedback for mobile gaming , 2011, Advances in Computer Entertainment Technology.

[40]  Sriram Subramanian,et al.  Perception of ultrasonic haptic feedback on the hand: localisation and apparent motion , 2014, CHI.

[41]  Gerhard Weber,et al.  Using Spatial Audio for the Enhanced Presentation of Synthesized Speech within Screen-Readers for Blind Computer Users , 1994, ICCHP.

[42]  Ali Israr,et al.  TeslaTouch: electrovibration for touch surfaces , 2010, UIST.

[43]  Douglas S. Brungart,et al.  Auditory localization of nearby sources in a virtual audio display , 2001, Proceedings of the 2001 IEEE Workshop on the Applications of Signal Processing to Audio and Acoustics (Cat. No.01TH8575).

[44]  S. Hart,et al.  Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research , 1988 .

[45]  Anne Papenfuss,et al.  ATC-Monitoring When One Controller Operates Two Airports , 2011 .

[46]  A John Van Opstal,et al.  The influence of duration and level on human sound localization. , 2004, The Journal of the Acoustical Society of America.

[47]  Hong Z. Tan,et al.  Using spatial vibrotactile cues to direct visual attention in driving scenes , 2005 .

[48]  Constantine Stephanidis,et al.  A generic direct-manipulation 3D-auditory environment for hierarchical navigation in non-visual interaction , 1996, Assets '96.

[49]  Thomas Maier,et al.  Driver Support by a Vibrotactile Seat Matrix – Recognition, Adequacy and Workload of Tactile Patterns in Take-over Scenarios During Automated Driving☆ , 2015 .