An exploratory study of multimodal interaction modeling based on neural computation

Multimodal interaction plays an important role in human-computer interaction. In this paper we propose a multimodal interaction model based on recent findings in cognitive research. The proposed model combines two proven neural computations and helps to reveal the enhancement or depression effects that different multimodal presentations exert on the performance of the corresponding interaction tasks. A set of experiments was designed and conducted within the applicability constraints of the model, and these experiments demonstrate both the observed performance-enhancement and performance-depression effects. Our approach and experimental results help to further answer the question of how tactile feedback signals contribute to multimodal interaction efficiency, and can thereby provide guidelines for designing tactile feedback in multimodal interaction.
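The abstract does not name the "two proven neural computations." A plausible reading, given the multisensory-integration literature this work builds on, is divisive normalization applied to summed crossmodal drive. The sketch below is our assumption, not the paper's published model (the function `normalized_response` and the parameters `sigma` and `n` are illustrative names); it shows how such a computation can produce both the enhancement and the depression effects the abstract describes:

```python
def normalized_response(drives, sigma=1.0, n=2.0):
    """Single-unit sketch of divisive normalization: the summed sensory
    drive is raised to the power n and divided by a pooled-activity term.
    (A full model would pool over a population of units; this is a
    deliberately minimal illustration.)"""
    d = sum(drives)
    return d ** n / (sigma ** n + d ** n)

# Weak inputs: the combined (multisensory) response exceeds the sum of
# the unimodal responses -- superadditive enhancement.
weak_v = normalized_response([0.1])        # one weak modality alone
weak_av = normalized_response([0.1, 0.1])  # two weak modalities together

# Strong inputs: the combined response falls below the sum of the
# unimodal responses -- subadditive depression.
strong_v = normalized_response([10.0])
strong_av = normalized_response([10.0, 10.0])
```

This regime dependence, where weak congruent inputs combine superadditively while strong inputs combine subadditively, is the "inverse effectiveness" pattern reported for multisensory neurons, and it matches the enhancement/depression dichotomy the abstract attributes to different multimodal presentations.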
