In Vivo or in Vitro? Influence of the Study Design on Crowdsourced Video QoE

Cited: 0
Authors
Borchert, Kathrin [1 ]
Schwind, Anika [1 ]
Hirth, Matthias [2 ]
Hossfeld, Tobias [1 ]
Affiliations
[1] University of Würzburg, Institute of Computer Science, Würzburg, Germany
[2] Technische Universität Ilmenau, Institute of Media Technology, Ilmenau, Germany
Source
2019 Eleventh International Conference on Quality of Multimedia Experience (QoMEX) | 2019
Keywords
crowdsourcing; video QoE; study design; crowdtesting; user study;
DOI
not available
Chinese Library Classification
TP39 [Computer Applications];
Discipline Classification Codes
081203 ; 0835 ;
Abstract
Evaluating the QoE of video streaming and its influence factors has become paramount for streaming providers, as they want to maintain high satisfaction among their customers. In this context, crowdsourced user studies have become a valuable tool to evaluate, on a large scale, different factors that can affect the perceived user experience. In general, we observed that most of these crowdsourcing studies use either an in vivo or an in vitro design. In vivo design means that the study participant rates the QoE of a video embedded in an application similar to a real streaming service, e.g., YouTube or Netflix. In vitro design refers to a setting in which the video stream is separated from a specific service, and thus the video plays on a plain background. Although these designs differ widely, their results are often compared and generalized. Therefore, in this work, we investigate the influence of these two study design alternatives on the perceived QoE. In crowdsourced user studies, participants rate video streaming with respect to different stalling patterns (no stalling, different stalling positions) and study designs (in vivo or in vitro). Contrary to our expectations, the results indicate no statistically significant influence of the study design on the perceived video QoE and acceptance. In addition, we found that the in vivo design does not reduce the test takers' attentiveness.
Pages: 6