Psycho-physiological assessment of a prosthetic hand sensory feedback system based on an auditory display: a preliminary study

Background: Prosthetic hand users must rely extensively on visual feedback to manipulate their devices, which appears to impose a high conscious burden. Indirect methods (electro-cutaneous, vibrotactile, and auditory cues) have been used to convey information from the artificial limb to the amputee, but the usability and advantages of these feedback methods have been explored mainly through performance results, without measuring the user's mental effort, attention, and emotions. The main objective of this study was to explore the feasibility of using psycho-physiological measurements to assess cognitive effort when manipulating a robot hand with and without a sensory substitution system based on auditory feedback, and to examine how these psycho-physiological recordings relate to temporal and grasping performance in a static setting.

Methods: Ten male subjects (26 +/- years old) participated in this study and attended on two consecutive days. On the first day the experiment objective, tasks, and setting were explained, after which the subjects completed a 30-minute guided training session. On the second day each subject was tested in three modalities: auditory-feedback-only control (AF), visual-feedback-only control (VF), and audiovisual-feedback control (AVF), performing 10 trials per modality. At the end of each test the subject answered the NASA TLX questionnaire, and during the test the subject's EEG, ECG, electro-dermal activity (EDA), and respiration rate were recorded.

Results: The results show that higher mental effort is needed when the subjects rely only on vision, and that this effort appears to be reduced when auditory feedback is added to the human-machine interaction (multimodal feedback). Furthermore, better temporal and grasping performance was obtained in the audiovisual modality.

Conclusions: The performance improvements obtained when auditory cues are combined with vision (multimodal feedback) can be attributed to a reduced attentional demand during the task, consistent with a visual "pop-out" or enhancement effect. The NASA TLX, the EEG alpha and beta bands, and the heart rate could be used to further evaluate sensory feedback systems in prosthetic applications.
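As an illustration of how the NASA TLX responses collected after each modality can be reduced to a single workload score, the following is a minimal sketch of the standard weighted TLX computation; the subscale labels and the `tlx_score` helper are illustrative and are not taken from the study's published analysis.

```python
# Minimal sketch of the standard (weighted) NASA TLX score.
# Each of the six subscales is rated on a 0-100 scale; the weights come
# from the 15 pairwise comparisons, so each weight is 0-5 and they sum to 15.

SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def tlx_score(ratings: dict, weights: dict) -> float:
    """Weighted NASA TLX: sum(rating * weight) / 15."""
    assert sum(weights[s] for s in SUBSCALES) == 15, "weights must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Example with hypothetical ratings for one subject after one modality.
ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 35}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(tlx_score(ratings, weights))  # overall workload on a 0-100 scale
```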

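The conclusion that EEG alpha and beta band power could serve as workload indicators suggests a simple spectral analysis. Below is a minimal sketch, assuming a single EEG channel sampled at `fs` Hz and using SciPy's Welch estimator; the band limits (alpha 8-13 Hz, beta 13-30 Hz) and the `band_power` helper are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, band, nperseg=1024):
    """Estimate the power of signal x within a frequency band (Hz)
    by integrating the Welch power spectral density over that band."""
    f, pxx = welch(x, fs=fs, nperseg=min(nperseg, len(x)))
    mask = (f >= band[0]) & (f <= band[1])
    return np.trapz(pxx[mask], f[mask])

# Example with synthetic data: 60 s of one EEG channel at 256 Hz.
fs = 256
t = np.arange(0, 60, 1 / fs)
eeg = np.random.randn(t.size)  # placeholder for a real recording

alpha = band_power(eeg, fs, (8, 13))   # lower alpha power is often read as higher attentional demand
beta = band_power(eeg, fs, (13, 30))   # beta power is associated with cognitive/emotional processing
print(alpha, beta, beta / alpha)
```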