Towards Multimodal Affective Feedback: Interaction between Visual and Haptic Modalities
Bipin Indurkhya | Akshita | Harini Alagarai Sampath | Eunhwa Lee | Yudong Bae