When the Crowd is Not Enough: Improving User Experience with Social Media through Automatic Quality Analysis

Cited by: 4
Authors
Pelleg, Dan [1]
Rokhlenko, Oleg [1]
Szpektor, Idan [1]
Agichtein, Eugene [2]
Guy, Ido [1]
Affiliations
[1] Yahoo Labs, Haifa, Israel
[2] Emory Univ, Atlanta, GA 30322 USA
Source
ACM CONFERENCE ON COMPUTER-SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING (CSCW 2016) | 2016
Keywords
Automatic quality evaluation; Quantitative analysis; A/B testing; User engagement
DOI
10.1145/2818048.2820022
CLC Number (Chinese Library Classification)
TP3 [Computing Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Social media gives voice to the people, but also opens the door to low-quality contributions, which degrade the experience for the majority of users. To address the latter issue, the prevailing solution is to rely on the "wisdom of the crowds" to promote good content (e.g., via votes or "like" buttons), or to downgrade bad content. Unfortunately, such crowd feedback may be sparse, subjective, and slow to accumulate. In this paper, we investigate the effects on users of automatically filtering question-answering content, using a combination of syntactic, semantic, and social signals. Using this filtering, a large-scale experiment with real users was performed to measure the resulting engagement and satisfaction. To our knowledge, this experiment represents the first reported large-scale user study of automatically curating social media content in real time. Our results show that automated quality filtering indeed improves user engagement, usually aligning with, and often outperforming, crowd-based quality judgments.
Pages: 1080-1090
Page count: 11