Read my points: Effect of animation type when speech-reading from EMA data
[1] William F. Katz, et al. The effects of EMA-based augmented visual feedback on the English speakers' acquisition of the Japanese flap: a perceptual study, Interspeech, 2010.
[2] D. Bates, et al. Fitting Linear Mixed-Effects Models Using lme4, arXiv:1406.5823, 2014.
[3] Slim Ouni, et al. Visual Contribution to Speech Perception: Measuring the Intelligibility of Animated Talking Heads, EURASIP J. Audio Speech Music Process., 2007.
[4] Takayuki Ito, et al. An electromagnetic articulography-based articulatory feedback approach to facilitate second language speech production learning, 2013.
[5] Mark K. Tiede, et al. Comparing L1 and L2 speakers using articulography, ICPhS, 2015.
[6] Philip Leifeld, et al. texreg: Conversion of Statistical Model Output in R to LaTeX and HTML Tables, 2013.
[7] Sonya Mehta, et al. Visual Feedback of Tongue Movement for Novel Speech Sound Learning, Front. Hum. Neurosci., 2015.
[8] Jun Wang, et al. Opti-Speech: a real-time, 3D visual feedback system for speech training, Interspeech, 2014.
[9] Gérard Bailly, et al. Can you 'read' tongue movements? Evaluation of the contribution of tongue display to speech understanding, Speech Commun., 2007.
[10] M. McNeil, et al. Treating apraxia of speech (AOS) with EMA-supplied visual augmented feedback, 2010.
[11] Slim Ouni, et al. VisArtico: a visualization tool for articulatory data, Interspeech, 2012.
[12] Slim Ouni, et al. Tongue Gestures Awareness and Pronunciation Training, Interspeech, 2011.
[13] Kristy James. Watch your tongue and read my lips, 2016.