Can visualization of internal articulators support speech perception?
[1] Olov Engwall, et al. Combining MRI, EMA and EPG measurements in a three-dimensional tongue model. Speech Communication, 2003.
[2] Jonas Beskow, et al. Animation of talking agents. AVSP, 1997.
[3] Michael M. Cohen, et al. Modeling Coarticulation in Synthetic Visual Speech. 1993.
[4] C. Quenin. The Cued Speech Resource Book for Parents of Deaf Children. 1994.
[5] Björn Granström, et al. Synthetic faces as a lipreading support. ICSLP, 1998.
[6] Olle Bälter, et al. Designing the user interface of the computer-based speech training system ARTUR based on early user tests. Behaviour & Information Technology, 2006.
[7] Sascha Fagel, et al. Visual information and redundancy conveyed by internal articulator dynamics in synthetic audiovisual speech. INTERSPEECH, 2007.
[8] W. H. Sumby, et al. Visual contribution to speech intelligibility in noise. 1954.
[9] A. MacLeod, et al. A procedure for measuring auditory and audio-visual speech-reception thresholds for sentences in noise: rationale, evaluation, and recommendations for use. British Journal of Audiology, 1990.
[10] Gérard Bailly, et al. Can you 'read' tongue movements? Evaluation of the contribution of tongue display to speech understanding. Speech Communication, 2007.
[11] Jonas Beskow, et al. Evaluation of a multilingual synthetic talking face as a communication aid for the hearing impaired. 2002.