Trajectory-Based Viewport Prediction for 360-Degree Virtual Reality Videos

Viewport-based adaptive streaming has emerged as the main technique to efficiently stream bandwidth-intensive 360° videos over the best-effort Internet. In viewport-based streaming, only the portion of the video currently watched by the user is streamed at the highest quality, typically by means of video tiling, foveation-based encoding, or similar approaches. To unlock the full potential of these approaches, however, the future position of the user's viewport has to be predicted. Indeed, accurate viewport prediction is necessary to minimize quality transitions while the user moves. Current solutions mainly focus on short-term prediction horizons (e.g., less than 2 s), while long-term viewport prediction has received less attention. This paper presents a novel algorithm for the long-term prediction of the user's viewport. In the proposed algorithm, the evolution of a user's viewport over time is modeled as a trajectory in the roll, pitch, and yaw angle domain. For a given video, a function modeling the evolution of the three angles over time is extrapolated from the viewing patterns of past users in the system. Moreover, trajectories that exhibit similar viewing behaviors are clustered together, and a different function is calculated for each cluster. The pre-computed functions are subsequently used at run-time to predict the future viewport position of a new user in the system, for the specific video. Preliminary results on a public dataset of 16 videos, each watched by 61 users on average, show that the proposed algorithm can increase the predicted viewport area by 13% on average compared to several benchmark heuristics, for prediction horizons of up to 10 seconds.
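
As a rough illustration of this pipeline, the sketch below clusters the yaw/pitch/roll trajectories of past viewers of a video, fits one extrapolation function per cluster, and assigns a new viewer's partial trajectory to the closest cluster to predict the viewport several seconds ahead. The use of k-means, the polynomial fits, the function names, and all parameter values are illustrative assumptions, not the paper's actual design.

```python
# Hypothetical sketch of trajectory-based long-term viewport prediction.
# Assumptions (not from the paper): fixed-rate angle samples, k-means
# clustering on flattened trajectories, per-cluster polynomial extrapolation.
import numpy as np
from sklearn.cluster import KMeans

def fit_cluster_models(trajectories, n_clusters=4, degree=3):
    """trajectories: array (n_users, n_samples, 3) of [yaw, pitch, roll] angles
    sampled at a fixed rate for one video. Returns the fitted KMeans object and
    one polynomial model per angle for each cluster."""
    n_users, n_samples, _ = trajectories.shape
    t = np.arange(n_samples)                      # sample index as time axis
    flat = trajectories.reshape(n_users, -1)      # one feature vector per user
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(flat)
    models = {}
    for c in range(n_clusters):
        members = trajectories[km.labels_ == c]   # trajectories in this cluster
        mean_traj = members.mean(axis=0)          # cluster's average trajectory
        # One polynomial per angle, modeling the cluster's mean evolution over time.
        models[c] = [np.polyfit(t, mean_traj[:, a], degree) for a in range(3)]
    return km, models

def predict_viewport(km, models, observed, horizon):
    """observed: (n_obs, 3) angles seen so far for a new user; horizon: number of
    samples ahead to predict. Matches the observed prefix to the nearest cluster
    centroid and evaluates that cluster's polynomials at the future time index."""
    n_obs = observed.shape[0]
    n_samples = km.cluster_centers_.shape[1] // 3
    # Compare the observed prefix with the same prefix of each cluster centroid.
    centroids = km.cluster_centers_.reshape(-1, n_samples, 3)[:, :n_obs, :]
    dists = np.linalg.norm(centroids - observed, axis=(1, 2))
    c = int(np.argmin(dists))
    t_future = n_obs + horizon
    return np.array([np.polyval(p, t_future) for p in models[c]])  # [yaw, pitch, roll]

# Toy usage: 61 synthetic users, 300 samples (e.g., 30 s at 10 Hz), predict 10 s ahead.
rng = np.random.default_rng(0)
past = rng.normal(0, 30, size=(61, 300, 3)).cumsum(axis=1) / 10
km, models = fit_cluster_models(past)
new_user = past[0, :100, :] + rng.normal(0, 1, size=(100, 3))
print(predict_viewport(km, models, new_user, horizon=100))
```

In this toy setup the main trade-off is cluster granularity: more clusters can capture more distinct viewing behaviors, but each extrapolation function is then fitted on fewer past users.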
