Using Physiological Cues to Determine Levels of Anxiety Experienced among Deaf and Hard of Hearing English Language Learners

Deaf and hard of hearing English language learners encounter a range of challenges when learning spoken/written English, many of which are not faced by their hearing counterparts. In this paper, we examine the feasibility of using physiological data, including arousal and eye gaze behaviors, to identify instances of anxiety and frustration experienced while delivering presentations. Initial findings demonstrate the potential of this approach, which could in turn help English language instructors provide either emotional support or personalized instruction to deaf and hard of hearing English language learners in the classroom.
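
The abstract does not specify how arousal is quantified; as a rough, hypothetical sketch (not the authors' pipeline), the snippet below counts skin conductance responses (SCRs) in an electrodermal activity (EDA) trace, a common proxy for arousal from wearable sensors. The sampling rate, filter settings, amplitude threshold, and the scr_rate helper are all illustrative assumptions.

```python
# Minimal sketch: counting skin conductance responses (SCRs) in an
# electrodermal activity (EDA) trace as a coarse arousal proxy.
# Sampling rate, thresholds, and function names are illustrative only.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks  # requires scipy >= 1.2


def scr_rate(eda_uS: np.ndarray, fs: float = 4.0, min_amp_uS: float = 0.05) -> float:
    """Return skin conductance responses per minute from a raw EDA trace (microsiemens)."""
    # Low-pass filter to suppress sensor noise before peak detection.
    b, a = butter(2, 1.0, btype="low", fs=fs)
    smoothed = filtfilt(b, a, eda_uS)
    # Separate the fast-changing (phasic) component from a moving-average tonic baseline.
    win = int(4 * fs)  # ~4 s window for the tonic estimate
    tonic = np.convolve(smoothed, np.ones(win) / win, mode="same")
    phasic = smoothed - tonic
    # Count peaks exceeding a minimum amplitude as SCRs, at most one per second.
    peaks, _ = find_peaks(phasic, height=min_amp_uS, distance=int(fs))
    duration_min = len(eda_uS) / fs / 60.0
    return len(peaks) / duration_min if duration_min > 0 else 0.0


# Example with synthetic data: a flat baseline plus a few injected responses.
if __name__ == "__main__":
    fs = 4.0
    t = np.arange(0, 120, 1 / fs)              # two minutes of samples
    eda = 2.0 + 0.01 * np.random.randn(t.size)
    for onset in (20, 55, 90):                  # three artificial SCRs
        idx = int(onset * fs)
        eda[idx:idx + int(5 * fs)] += 0.2 * np.exp(-np.arange(int(5 * fs)) / (2 * fs))
    print(f"SCRs per minute: {scr_rate(eda, fs):.2f}")
```

In a presentation setting, such a per-minute response rate could be computed over sliding windows and aligned with eye gaze annotations; the specific feature set and fusion used in the study are not described in the abstract.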
