Immersive Sonification for Displaying Brain Scan Data

Brain scans produce data that is challenging to display because of its complexity, multi-dimensionality, and range. Visual representations of such data are limited by the nature of the display, the number of dimensions that can be represented visually, and the capacity of the human visual system to perceive and interpret them. This paper describes the use of sonification to interpret brain scans, employing sound as a complementary tool for viewing, analysis, and diagnosis. The sonification tool SoniScan is described and evaluated as a method to augment visual brain data display.
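The core idea of sonification is parameter mapping: data values drive sound parameters such as pitch, so a listener can follow a dimension of the data that would otherwise compete for visual attention. The minimal sketch below is hypothetical and is not the SoniScan implementation; it simply maps a row of voxel intensities (normalized to 0..1) to the pitches of successive sine tones, a common baseline technique in the auditory-display literature.

```python
import numpy as np

def sonify_scanline(intensities, base_freq=220.0, octaves=2.0,
                    note_dur=0.1, sample_rate=44100):
    """Map each voxel intensity in [0, 1] to the pitch of a short
    sine tone and concatenate the tones into one audio buffer.

    Higher intensity -> higher pitch, spanning `octaves` octaves
    above `base_freq`. Purely illustrative parameter mapping.
    """
    intensities = np.clip(np.asarray(intensities, dtype=float), 0.0, 1.0)
    n = int(note_dur * sample_rate)          # samples per tone
    t = np.arange(n) / sample_rate           # time axis for one tone
    tones = []
    for v in intensities:
        freq = base_freq * 2.0 ** (v * octaves)  # pitch rises with intensity
        env = np.hanning(n)                      # fade in/out to avoid clicks
        tones.append(env * np.sin(2.0 * np.pi * freq * t))
    return np.concatenate(tones)

# One scan line of three voxels: silence-dark, mid-gray, bright.
audio = sonify_scanline([0.0, 0.5, 1.0])
```

The resulting buffer can be written to a WAV file or played directly; richer mappings (loudness, timbre, spatial position) add further data dimensions per tone.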
