Language proficiency modulates listeners’ selective attention to a talker’s mouth: A conceptual replication of Birulés et al. (2020)

Abstract This study presents a conceptual replication of Birulés et al.’s (2020, Experiment 2) investigation of native and nonnative listeners’ selective attention to a talker’s mouth, with the goal of better understanding the potentially modulating role of proficiency in listeners’ reliance on audiovisual speech cues. Listeners’ eye gaze was recorded while they watched two short videos. Findings from one of the videos replicated results of the original study, showing greater attention to the talker’s mouth among L2 than among L1 listeners. In both videos, L2 proficiency modulated attention, with more fixations on the mouth among lower-proficiency listeners, an effect predicted but not observed in the original study. Collectively, these laboratory-based findings highlight the role of visual speech cues in L2 listening and provide evidence that listeners with more limited proficiency may be especially reliant on such cues. These observations warrant future investigation of the benefits of visual speech cues in instructional and assessment contexts.

[1] Julia Schwarz et al., Semantic Cues Modulate Children’s and Adults’ Processing of Audio-Visual Face Mask Speech, 2022, Frontiers in Psychology.

[2] Ruslan Suvorov et al., Visuals in the Assessment and Testing of Second Language Listening: A Methodological Synthesis, 2021, International Journal of Listening.

[3] D. Lewkowicz et al., Highly proficient L2 speakers still need to attend to a talker’s mouth when processing L2 speech, 2020.

[4] Linda Drijvers et al., Degree of Language Experience Modulates Visual Attention to Visible Speech and Iconic Gestures During Clear and Degraded Speech Comprehension, 2019, Cognitive Science.

[5] D. Poulin-Dubois et al., Selective attention to the mouth of talking faces in monolinguals and bilinguals aged 5 months to 5 years, 2019, Developmental Psychology.

[6] Emma Marsden et al., Replication in Second Language Research: Narrative and Systematic Reviews and Recommendations for the Field, 2018.

[7] M. Król, Auditory noise increases the allocation of attention to the mouth, and the eyes pay the price: An eye-tracking study, 2018, PLoS ONE.

[8] Luke Plonsky et al., Multiple Regression as a Flexible Alternative to ANOVA in L2 Research, 2016, Studies in Second Language Acquisition.

[9] K. Sekiyama et al., Language/Culture Modulates Brain and Gaze Processes in Audiovisual Speech Perception, 2016, Scientific Reports.

[10] David J. Lewkowicz et al., Language familiarity modulates relative attention to the eyes and mouth of a talker, 2016, Cognition.

[11] Todd Ruecker et al., White Native English Speakers Needed: The Rhetorical Construction of Privilege in Online Teacher Recruitment Spaces, 2015.

[12] Kevin B. McGowan, Social Expectation Improves Speech Perception in Noise, 2015, Language and Speech.

[13] L. Taylor et al., Examining Listening: Research and Practice in Assessing Second Language Listening, 2013.

[14] S. Papageorgiou et al., The Relative Difficulty of Dialogic and Monologic Input in a Second-Language Listening Comprehension Test, 2012.

[15] J. Hulstijn, The construct of language proficiency in the study of bilingualism from a cognitive perspective, 2012, Bilingualism: Language and Cognition.

[16] D. Lewkowicz et al., Infants deploy selective attention to the mouth of a talking face when learning speech, 2012, Proceedings of the National Academy of Sciences.

[17] Kristin Lemhöfer et al., Introducing LexTALE: A quick and valid Lexical Test for Advanced Learners of English, 2011, Behavior Research Methods.

[18] Anne Cutler et al., Non-native speech perception in adverse conditions: A review, 2010, Speech Communication.

[19] Rachael E. Jack et al., Cultural Confusions Show that Facial Expressions Are Not Universal, 2009, Current Biology.

[20] D. Burnham et al., Impact of language on development of auditory-visual speech perception, 2008, Developmental Science.

[21] Suresh Canagarajah, Lingua Franca English, Multilingual Communities, and Language Acquisition, 2007.

[22] Margarita Kaushanskaya et al., The Language Experience and Proficiency Questionnaire (LEAP-Q): Assessing language profiles in bilinguals and multilinguals, 2007, Journal of Speech, Language, and Hearing Research.

[23] Edgar Erdfelder et al., G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences, 2007, Behavior Research Methods.

[24] D. Hardison et al., The Role of Gestures and Facial Cues in Second Language Listening Comprehension, 2005.

[25] M. Maclagan et al., Speaking rates of American and New Zealand varieties of English, 2004, Clinical Linguistics & Phonetics.

[26] D. Hardison, Acquisition of second-language speech: Effects of visual cues, context, and talker variability, 2003, Applied Psycholinguistics.

[28] P. Marler et al., Communication Goes Multimodal, 1999, Science.

[29] E. Vatikiotis-Bateson et al., Eye movement of perceivers during audiovisual speech perception, 1998, Perception & Psychophysics.

[30] Debra M. Hardison et al., Bimodal Speech Perception by Native and Nonnative Speakers of English: Factors Influencing the McGurk Effect, 1996.

[31] Roger Griffiths et al., Speech Rate and Listening Comprehension: Further Evidence of the Relationship, 1992.

[32] Y. Tohkura et al., McGurk effect in non-English listeners: Few visual effects for Japanese subjects hearing Japanese syllables of high auditory intelligibility, 1991, The Journal of the Acoustical Society of America.

[33] H. McGurk et al., Hearing lips and seeing voices, 1976, Nature.

[34] W. H. Sumby et al., Visual contribution to speech intelligibility in noise, 1954.

[35] H. Egeth, Selective Attention, 2019, Encyclopedia of Personality and Individual Differences.

[36] Ken W. Grant, Toward a Model of Auditory-Visual Speech Intelligibility, 2019, Multisensory Processes.

[37] R Core Team, R: A language and environment for statistical computing, 2014.