History-based Article Quality Assessment on Wikipedia

Cited by: 51
Authors
Zhang, Shiyue [1 ]
Hu, Zheng [1 ]
Zhang, Chunhong [1 ]
Yu, Ke [1 ]
Affiliation
[1] Beijing Univ Posts & Telecommun, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
Source
2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP) | 2018
Keywords
Wikipedia; Information Quality; LSTM;
DOI
10.1109/BigComp.2018.00010
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline Code
081202;
Abstract
Wikipedia is widely considered the largest encyclopedia on the Internet, and the quality assessment of its articles has been studied for years. Conventional methods addressed this task with feature engineering and statistical machine learning algorithms; however, manually defined features struggle to represent the long edit history of an article. Recently, researchers proposed an end-to-end neural model that uses a Recurrent Neural Network (RNN) to learn the representation automatically. Although the RNN showed its power in modeling edit history, the end-to-end method is time- and resource-consuming. In this paper, we propose a new history-based method to represent an article. We also take advantage of an RNN to handle the long edit history, but we do not abandon feature engineering: we still represent each revision of an article by manually defined features. This combination of a deep neural model and feature engineering makes our model both simple and effective. Experiments demonstrate that our model performs better than or comparably to previous work, and that it has the potential to run as a real-time service. In addition, we extend our model to perform quality prediction.
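The core idea in the abstract can be sketched in code: summarize each revision of an article with a small hand-crafted feature vector and let an LSTM read the revision sequence, so that the final hidden state summarizes the whole edit history for a quality classifier. This is a minimal illustrative sketch only, not the paper's actual implementation; the feature names, dimensions, and the untrained random weights below are assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(m, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(w * x for w, x in zip(row, v)) for row in m]

class TinyLSTM:
    """Minimal single-cell LSTM over a sequence of feature vectors (inference only)."""

    def __init__(self, in_dim, hid_dim):
        self.hid_dim = hid_dim
        # One stacked weight matrix covering the input, forget, cell, and output gates.
        self.w = [[random.gauss(0, 0.1) for _ in range(in_dim + hid_dim)]
                  for _ in range(4 * hid_dim)]
        self.b = [0.0] * (4 * hid_dim)

    def run(self, revisions):
        h = [0.0] * self.hid_dim
        c = [0.0] * self.hid_dim
        for x in revisions:  # one LSTM step per revision
            z = [zi + bi for zi, bi in zip(matvec(self.w, x + h), self.b)]
            n = self.hid_dim
            i, f, g, o = z[0:n], z[n:2 * n], z[2 * n:3 * n], z[3 * n:4 * n]
            c = [sigmoid(fj) * cj + sigmoid(ij) * math.tanh(gj)
                 for ij, fj, gj, cj in zip(i, f, g, c)]
            h = [sigmoid(oj) * math.tanh(cj) for oj, cj in zip(o, c)]
        return h  # fixed-size summary of the whole edit history

# Hypothetical input: 20 revisions, 6 hand-crafted features per revision
# (e.g. article length, reference count, number of distinct editors, ...).
history = [[random.gauss(0, 1) for _ in range(6)] for _ in range(20)]
lstm = TinyLSTM(in_dim=6, hid_dim=8)
summary = lstm.run(history)

# An untrained linear quality head over the summary, e.g. 3 quality classes.
w_out = [[random.gauss(0, 0.1) for _ in range(8)] for _ in range(3)]
scores = matvec(w_out, summary)
print(len(summary), len(scores))  # prints "8 3"
```

The design point the abstract makes is that the per-revision features stay cheap and interpretable, while the LSTM (rather than a hand-designed aggregate) handles the temporal structure of an arbitrarily long edit history.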
Pages: 1-8 (8 pages)