Evaluating feedback devices for time-continuous mobile multimedia quality assessment

In January 2014, the new ITU-T P.913 recommendation for measuring subjective video, audio and multimedia quality in any environment was published. This recommendation does not include any time-continuous subjective method. However, environmental parameters change continuously in a majority of outdoor and most indoor environments. To assess their impact on perceived quality, a time-continuous quality assessment methodology is necessary. Previous standards, targeting laboratory-based test settings, recommend a desk-mounted slider of substantial size. Unfortunately, there are many environments where such a device cannot be used. In this paper, new feedback tools for mobile time-continuous rating are presented and analysed. We developed several alternatives to the generally adopted desk-mounted slider as a rating device. To compare the tools, we defined a number of performance measures that can be used in further studies. The suitability and efficacy of the rating schemes are compared based on measurable parameters as well as user opinions. One method, the finger count, appears to outperform the others from all points of view. It was judged to be easy to use, with low potential for distraction. Furthermore, it reaches a precision level similar to the slider while requiring lower user reaction and scoring times. Low reaction times are particularly important for time-continuous quality assessment, where the reliability of the mapping between impairments and user ratings plays an essential role.

Highlights

- We developed a scheme to objectively compare subjective rating methods.
- We tested this scheme by comparing several rating devices.
- Users prefer to express experienced quality with a simple finger count method.
- The finger count method also outperformed the other approaches objectively.
- Haptic feedback distracts the user from the task instead of providing help.
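To illustrate why low reaction times matter for the impairment-to-rating mapping, the minimal sketch below shifts a sampled rating trace back by a measured reaction time before aligning it with impairment events. The function name, sampling interval, and values are illustrative assumptions, not taken from the paper's method.

```python
# Hypothetical sketch: before mapping time-continuous ratings onto
# impairment events, the trace must be shifted back by the rater's
# reaction time. All names and values here are illustrative.

def align_ratings(ratings, reaction_time, sample_interval=0.1):
    """Shift a sampled rating trace earlier by the measured reaction time.

    ratings: list of rating samples, one per sample_interval seconds.
    reaction_time: average delay (s) between impairment onset and response.
    """
    shift = round(reaction_time / sample_interval)
    # Drop the leading samples that predate any possible response.
    return ratings[shift:] if shift > 0 else ratings

# Example: a 5-point MOS-style trace sampled every 0.1 s,
# compensated for a 0.3 s reaction time.
trace = [5, 5, 5, 3, 3, 3, 4, 5]
aligned = align_ratings(trace, reaction_time=0.3)
print(aligned)  # → [3, 3, 3, 4, 5]
```

The longer and less consistent the reaction time of a rating device, the larger and noisier this shift becomes, which is why a device with short reaction times yields a more reliable impairment-to-rating mapping.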
