Web-based platform for a customizable and synchronized presentation of subtitles in single- and multi-screen scenarios

This paper presents a web-based platform for the customizable and synchronized presentation of subtitles in both single- and multi-screen scenarios. The platform enables dynamic user-level customization of subtitles in terms of format (font family, size, color, transparency...) and position, according to the users' preferences and/or needs. It also allows adjusting the number of subtitle lines to be presented, and clicking on a specific line skips playout to the position where that line begins. Likewise, multiple languages can be presented simultaneously, and a delay offset can be applied to the presentation of subtitles. All these functionalities are also available on companion devices, by associating them with the session on the main screen. This enables presenting subtitles in synchrony with the content on the main screen while customizing them independently on each device. The platform supports different subtitle formats, as well as HTML5 and YouTube videos. It includes a module to upload videos and their subtitle files, and to manage playlists. Overall, the platform enables personalized and more engaging consumption experiences, contributing to an improved Quality of Experience (QoE). It can additionally provide benefits in a variety of scenarios, such as language learning and crowded, multi-cultural or noisy environments. The results of a subjective evaluation study, with the participation of 40 users without accessibility needs, reveal that the platform can provide relevant benefits for the whole spectrum of consumers. In particular, users were very satisfied with the usability, attractiveness, effectiveness and usefulness of all features of the platform.

Demo video: https://goo.gl/TdixNz
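Two of the features described above — applying a user-configured delay offset to subtitles and seeking to the start of a clicked line — reduce to simple cue arithmetic. The following is a minimal sketch of that logic; the function and field names (`applyDelayOffset`, `seekPositionForLine`, `start`/`end`/`text`) are illustrative assumptions, not the platform's actual API:

```javascript
// A subtitle cue: start/end times in seconds, plus the rendered text.
function makeCue(start, end, text) {
  return { start, end, text };
}

// Shift all cues by a delay offset (positive = subtitles appear later).
// Times are clamped at 0 so an offset cannot produce negative timestamps.
function applyDelayOffset(cues, offsetSeconds) {
  return cues.map(c => ({
    ...c,
    start: Math.max(0, c.start + offsetSeconds),
    end: Math.max(0, c.end + offsetSeconds),
  }));
}

// When the user clicks the i-th presented line, return the playout
// position to seek to (the start of that cue), or null if out of range.
function seekPositionForLine(cues, lineIndex) {
  if (lineIndex < 0 || lineIndex >= cues.length) return null;
  return cues[lineIndex].start;
}

// Example: two cues, with a +0.5 s delay offset applied.
const cues = [makeCue(1.0, 2.5, "Hello"), makeCue(3.0, 4.0, "World")];
const shifted = applyDelayOffset(cues, 0.5);
console.log(shifted[0].start);            // 1.5
console.log(seekPositionForLine(shifted, 1)); // 3.5
```

In a browser, the returned seek position would typically be assigned to the HTML5 video element's `currentTime` property, while per-user formatting (font, size, color) can be applied to the rendered subtitle container via CSS.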
