
SCIENCE CHINA Information Sciences, Volume 63, Issue 10: 202301 (2020). https://doi.org/10.1007/s11432-019-2734-y

Measuring quality of experience for 360-degree videos in virtual reality

  • Received: Jul 18, 2019
  • Accepted: Nov 15, 2019
  • Published: Aug 19, 2020

Abstract

In recent years, we have witnessed dramatically growing attention to immersive media technologies such as 360-degree videos and virtual reality (VR). However, measuring the quality of experience (QoE) for 360-degree VR videos is not a trivial task. Streaming such videos to head-mounted displays (HMDs) is extremely bandwidth-demanding compared with traditional 2D videos. In HTTP adaptive streaming, QoE tends to deteriorate significantly under fluctuating network conditions, which cause frequent bitrate changes and multiple stalling events during playback. Thus, understanding how the human visual system perceives 360-degree video under the effects of stalling and different bitrate levels becomes inevitable. In this paper, we investigate the impact of stalling on users' QoE under different bitrate levels, as well as the interaction between stalling events and bitrate levels, for 360-degree videos in VR. To this end, we first build a 360-degree video database by encoding videos at three bitrate levels (1, 5, and 15 Mbps) at 4K resolution ($3840\times1920$ pixels). We then simulate various stalling events in the videos and conduct a subjective experiment in a virtual reality environment to investigate the human responses. Finally, we use a Bayesian method to estimate and predict the QoE while measuring the quality drop caused by various stalling events and bitrate changes. The proposed solution and its prediction results show a strong dependency between playback stalling and the bitrate of a 360-degree video in VR. Stalling always degrades the QoE of 360-degree videos, but the strength of this negative effect depends on the video bitrate level. The adverse effect of stalling events is more pronounced when the bitrate level approaches either the high or the low end, which is in close agreement with the subjective opinions.


  • Figure 1

    (Color online) Source sequence thumbnails. (a) Idaho Boat; (b) Animation 1; (c) Science Fiction HELP; (d) Gyrocopter; (e) Snow driving; (f) Animation 2; (g) Skiing; (h) Military parade; (i) Beach volleyball; (j) Rio Olympics; (k) Cockpit view; (l) Undersea; (m) Roller coaster; (n) Animation 3; (o) Surrounded by Elephants; (p) Project Soul.

  • Figure 2

    (Color online) PLCC and SRCC performance evaluation of individual subjects.

  • Figure 3

    (Color online) MOS of all compressed videos in the absence of stalling.

  • Figure 4

    (Color online) The posterior QoE with three different video bitrates in the absence of stalling (a), and in the presence of initial stalling (b), mid stalling (c), multiple stalling (d).

  • Figure 5

    (Color online) Prediction of overall QoE.

  • Figure 6

    (Color online) PLCC and SRCC performance evaluation of the average MOS and predicted values.

  • Table 1  

    Table 1  The details of the source 360-degree videos

    Index | 360-degree video name | Description | Frame rate (fps) | SI/TI
    (a) | Idaho Dinghy Boat | Human, high motion | 30 | 59.9/66.8
    (b) | Animation 1 | Outdoor scene | 24 | 48.3/38.4
    (c) | Science Fiction HELP | Human, indoor | 30 | 28.5/18.4
    (d) | Skyhub Dubai, Gyrocopter | Human, architecture | 30 | 42.9/14.7
    (e) | Snow driving | Human, nature | 25 | 54.7/22.4
    (f) | Animation 2 | Snake in the forest | 24 | 49.1/0.9
    (g) | Skiing | Human, high motion | 25 | 54.1/1.5
    (h) | Military parade | Human, outdoor | 30 | 50.1/16.1
    (i) | Beach volleyball | Human, high motion | 24 | 30.6/7.4
    (j) | Rio Olympics | Human, outdoor | 24 | 56.8/15.5
    (k) | Cockpit view | Human, indoor | 25 | 59.5/21.7
    (l) | Undersea | Human, nature | 30 | 32.1/7.1
    (m) | Roller coaster | Human, high motion | 30 | 78.3/48.1
    (n) | Animation 3 | Outdoor scene | 24 | 64.7/56.2
    (o) | Surrounded by Elephants | Wild, nature | 30 | 36.2/2.8
    (p) | Project Soul | Human, indoor | 30 | 27.5/11.9
  •   

    Algorithm 1 Bitrate posterior QoE estimation

    Require: sets QoE, bitrates.
    Output: sets ${\rm BR}_m$ $(m = 1, 2, 3)$.
    Initialization: $m = 1$.
    for each $i$ in bitrates
        for each $j$ in QoE
            Calculate the posterior of $j$ with respect to $i$;
            Add the posterior to ${\rm BR}_m$;
        end for
        return ${\rm BR}_m$;
        Increment ${\rm BR}_m$ to ${\rm BR}_{m+1}$;
    end for
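    The Bayesian update in Algorithm 1 is not spelled out on this page; purely as an illustration, the following Python sketch assumes a conjugate normal-normal model for the mean QoE of each bitrate level. The prior and noise parameters, the helper names, and the example scores are assumptions for this sketch, not the paper's exact hierarchical formulation.

        import numpy as np

        def posterior_mean_qoe(scores, prior_mean=5.0, prior_var=4.0, noise_var=1.0):
            # Conjugate normal-normal update: scores are treated as i.i.d. Gaussian
            # around an unknown mean with known noise variance, and the prior on the
            # mean is Gaussian. All parameter values are illustrative assumptions.
            scores = np.asarray(scores, dtype=float)
            n = scores.size
            post_var = 1.0 / (1.0 / prior_var + n / noise_var)
            post_mean = post_var * (prior_mean / prior_var + scores.sum() / noise_var)
            return post_mean, post_var

        def estimate_bitrate_posteriors(qoe_by_bitrate):
            # Loops of Algorithm 1: one posterior set BR_m per bitrate level.
            return {br: posterior_mean_qoe(scores) for br, scores in qoe_by_bitrate.items()}

        # Hypothetical opinion scores on a 1-10 scale for the three bitrate levels (Mbps).
        ratings = {1: [3.5, 4.0, 4.2], 5: [6.0, 6.5, 6.2], 15: [8.5, 8.8, 8.6]}
        print(estimate_bitrate_posteriors(ratings))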

  •   

    Algorithm 2 Stalling experience posterior QoE estimation under different bitrate levels

    Require: sets ${\rm BR}_1$, ${\rm BR}_2$, ${\rm BR}_3$, Stalling.
    Output: sets ${\rm ST}_p$ $(p = 1, 2, 3)$, ${\rm ST}_q$ $(q = 1, 2, 3)$, and ${\rm ST}_r$ $(r = 1, 2, 3)$.
    Initialization: $p = q = r = 1$.
    for each $a$ in Stalling
        for each $b$ in ${\rm BR}_1$
            Calculate the posterior of $a$ with respect to $b$;
            Add the posterior to ${\rm ST}_p$;
        end for
        return ${\rm ST}_p$;
        Increment ${\rm ST}_p$ to ${\rm ST}_{p+1}$;
        for each $c$ in ${\rm BR}_2$
            Calculate the posterior of $a$ with respect to $c$;
            Add the posterior to ${\rm ST}_q$;
        end for
        return ${\rm ST}_q$;
        Increment ${\rm ST}_q$ to ${\rm ST}_{q+1}$;
        for each $d$ in ${\rm BR}_3$
            Calculate the posterior of $a$ with respect to $d$;
            Add the posterior to ${\rm ST}_r$;
        end for
        return ${\rm ST}_r$;
        Increment ${\rm ST}_r$ to ${\rm ST}_{r+1}$;
    end for
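    Under the same illustrative assumptions, Algorithm 2 can be sketched by conditioning the update on both the stalling pattern and the bitrate level. The posterior_mean_qoe() helper is the hypothetical function from the previous sketch, and the ratings shown are invented for the example.

        def estimate_stalling_posteriors(qoe_by_condition):
            # qoe_by_condition maps (bitrate, stalling event) -> list of scores, so each
            # ST set of Algorithm 2 is one posterior per stalling pattern within a fixed
            # bitrate level.
            return {key: posterior_mean_qoe(scores) for key, scores in qoe_by_condition.items()}

        # Hypothetical ratings for the 1 Mbps level under the four stalling conditions.
        ratings_1mbps = {
            (1, "none"): [3.9, 4.0], (1, "initial"): [3.4, 3.6],
            (1, "middle"): [3.2, 3.4], (1, "multiple"): [2.7, 2.9],
        }
        print(estimate_stalling_posteriors(ratings_1mbps))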

  • Table 2  

    Table 2  Average MOS values for various stalling events under different bitrate levels

    Stalling event | Bitrate (1 Mbps) | Bitrate (5 Mbps) | Bitrate (15 Mbps)
    No stalling | 3.94 | 6.30 | 8.67
    Initial stalling | 3.49 | 5.66 | 6.94
    Middle stalling | 3.30 | 5.04 | 6.22
    Multiple stalling | 2.79 | 4.38 | 5.44
    No stalling – initial stalling | 0.45 | 0.64 | 1.73
    No stalling – middle stalling | 0.64 | 1.26 | 2.45
    No stalling – multiple stalling | 1.15 | 1.92 | 3.23
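    The last three rows of Table 2 are the MOS drops caused by each stalling pattern relative to no stalling; a short sketch reproducing them from the first four rows (the dictionary layout is only for illustration):

        # Average MOS from Table 2; columns are the 1, 5 and 15 Mbps levels.
        mos = {
            "none":     (3.94, 6.30, 8.67),
            "initial":  (3.49, 5.66, 6.94),
            "middle":   (3.30, 5.04, 6.22),
            "multiple": (2.79, 4.38, 5.44),
        }

        # The drop grows with bitrate, matching the degradation rows of the table.
        for event in ("initial", "middle", "multiple"):
            drop = tuple(round(a - b, 2) for a, b in zip(mos["none"], mos[event]))
            print(f"no stalling - {event} stalling: {drop}")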
  • Table 3  

    Table 3  Performance comparison of the proposed technique

    Model | PLCC | SRCC
    Model 1 | 0.6769 | 0.6861
    Model 2 | 0.7190 | 0.7158
    Model 3 | 0.7476 | 0.7205
    Model 4 | 0.7723 | 0.7205
    Model 5 | 0.7905 | 0.7430
    Model 6 | 0.7395 | 0.6414
    Model 7 | 0.7455 | 0.6560
    Proposed technique | 0.8095 | 0.7954
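    The PLCC and SRCC values in Table 3 and Figures 2 and 6 are the usual Pearson linear and Spearman rank correlations between subjective MOS and predicted QoE. A minimal sketch of how they are typically computed follows; the score vectors below are placeholders, not data from the experiment.

        from scipy.stats import pearsonr, spearmanr

        def plcc_srcc(mos, predicted):
            # Pearson linear correlation (PLCC) and Spearman rank correlation (SRCC);
            # higher values indicate closer agreement with the subjective scores.
            plcc, _ = pearsonr(mos, predicted)
            srcc, _ = spearmanr(mos, predicted)
            return plcc, srcc

        # Placeholder values for illustration only.
        mos_scores = [3.94, 3.49, 3.30, 2.79, 6.30, 5.66, 5.04, 4.38]
        predicted  = [4.10, 3.60, 3.20, 2.90, 6.10, 5.70, 5.10, 4.50]
        print(plcc_srcc(mos_scores, predicted))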
