Effects of augmentative visual training on audio-motor mapping.

The purpose of this study was to determine the effect of augmentative visual feedback training on auditory-motor performance. Thirty-two healthy young participants used facial surface electromyography (sEMG) to control a human-machine interface (HMI) whose output was vowel synthesis. An auditory-only (AO) group (n=16) trained with auditory feedback alone, and an auditory-visual (AV) group (n=16) trained with auditory feedback plus visual feedback that was progressively removed. Participants completed three training sessions and one testing session over 3 days. During the testing session they were given novel targets to assess auditory-motor generalization. We hypothesized that the AV group would perform better on the novel targets than the AO group. Analysis of variance on the percentage of total targets reached indicated a significant interaction between group and session: individuals in the AV group performed significantly better than those in the AO group during early training sessions (while visual feedback was available), but no difference was found between the groups during later sessions. These results suggest that augmentative visual feedback during training does not improve auditory-motor performance.
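
The analysis described above is a mixed-design ANOVA with group (AO vs. AV) as a between-subjects factor and session as a within-subjects factor. The sketch below is not the authors' analysis code; it illustrates the general approach on simulated placeholder data, with hypothetical column names ('subject', 'group', 'session', 'pct_targets') and the pingouin library assumed as the analysis tool.

```python
# Minimal sketch of a group x session mixed ANOVA on percentage of targets reached.
# Data, effect sizes, and column names are hypothetical, for illustration only.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)

# Simulated data: 32 subjects (16 AV, 16 AO), 3 training sessions each.
rows = []
for subj in range(32):
    group = "AV" if subj < 16 else "AO"
    for session in (1, 2, 3):
        base = 60 + 5 * session                                  # general improvement
        boost = 10 if (group == "AV" and session == 1) else 0    # early AV advantage
        rows.append({
            "subject": subj,
            "group": group,
            "session": session,
            "pct_targets": base + boost + rng.normal(0, 5),
        })
df = pd.DataFrame(rows)

# Mixed-design ANOVA: 'group' is between-subjects, 'session' is within-subjects.
aov = pg.mixed_anova(data=df, dv="pct_targets", within="session",
                     subject="subject", between="group")
print(aov[["Source", "F", "p-unc"]])
```

A significant group x session interaction in such an analysis would typically be followed by post hoc pairwise comparisons within each session to locate where the group difference occurs.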
