Psychoacoustic Sonification as User Interface for Human-Machine Interaction

When operating a machine, the operator needs to know certain spatial relations, such as the relative location of the target or of the nearest obstacle. Often, sensors are used to derive this spatial information, and visual displays are deployed as interfaces to communicate it to the operator. In this paper, we present psychoacoustic sonification as an alternative interface for human-machine interaction. Instead of visualizations, an interactive sound guides the operator to the desired target location, or helps her avoid obstacles in space. By considering psychoacoustics, i.e., the relationship between the physical and the perceptual attributes of sound, in the audio signal processing, we can communicate precise, unambiguously interpretable direction and distance cues along three orthogonal axes to a user. We present exemplary use cases from various application areas in which users can benefit from psychoacoustic sonification.
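To make the idea concrete, the following is a minimal sketch of such an interactive sonification loop, not the authors' actual sound design: it assumes a hypothetical mapping in which the x offset sets the direction and speed of a pitch glide, the y offset sets brightness (number of harmonics), and the z offset sets the rate of a loudness fluctuation (beating). The function names and parameter choices are illustrative only.

```python
import math

def sonify(x, y, z, duration=0.5, sr=8000):
    """Render one mono audio cue (list of samples in [-1, 1]) whose
    perceptual attributes encode the spatial offset (x, y, z) to a target.

    Hypothetical mapping (an assumption, not the published design):
      x -> direction/steepness of a pitch glide (rising vs. falling)
      y -> brightness, via the number of synthesized harmonics
      z -> beating rate of an amplitude-modulation envelope
    """
    n = int(duration * sr)
    base = 440.0                               # reference pitch in Hz
    harmonics = 1 + int(4 * max(0.0, y))       # more partials = brighter
    beat_rate = 2.0 + 10.0 * abs(z)            # beats per second
    samples = []
    phase = 0.0
    for i in range(n):
        t = i / sr
        # Exponential pitch glide: sign of x sets rising vs. falling.
        f = base * 2.0 ** (x * t / duration)
        phase += 2.0 * math.pi * f / sr
        # Simple additive synthesis with 1/k harmonic amplitudes.
        s = sum(math.sin(k * phase) / k for k in range(1, harmonics + 1))
        # Beating (loudness fluctuation) envelope driven by z.
        am = 0.5 * (1.0 + math.cos(2.0 * math.pi * beat_rate * t))
        samples.append(am * s)
    return samples

# In a closed interaction loop, the sensor-derived offset would be
# re-sonified continuously as the operator moves toward the target:
cue = sonify(x=0.5, y=0.2, z=-0.3)
```

In a real system the cue would be streamed to an audio device and the mapping tuned so that each axis remains perceptually independent of the others, which is the core psychoacoustic design problem the paper addresses.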
