Exploring the Effects of Audience Visibility on Presenters and Attendees in Online Educational Presentations

Video conferencing is widely used to deliver educational presentations, such as lectures or informational webinars, to a distributed audience. While individuals in a dyadic conversation can use webcam streams to assess their interlocutor's engagement with relative ease, as the size of the audience in a video conference grows, it becomes harder to interpret how engaged the overall group may be. In this work, we use a mixed-methods approach to understand how presenters and attendees of online presentations use available cues to perceive and interpret audience behavior (such as how engaged the group is). Our results suggest that while presenters see webcams as useful for increasing audience visibility and encouraging attention, audience members do not uniformly benefit from seeing others' webcams; other interface cues, such as chat, may be more useful and informative engagement indicators for both parties. We conclude with design recommendations for future systems to improve what is sensed and presented.