Relevance-Based Automated Essay Scoring via Hierarchical Recurrent Model

Cited: 0
Authors
Chen, Minping [1 ]
Li, Xia [1 ,2 ]
Affiliations
[1] Guangdong Univ Foreign Studies, Sch Informat Sci & Technol, Guangzhou, Guangdong, Peoples R China
[2] Guangdong Univ Foreign Studies, Key Lab Language Engn & Comp, Guangzhou, Guangdong, Peoples R China
Source
2018 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP) | 2018
Funding
U.S. National Science Foundation;
Keywords
Topic information; Automated essay scoring; Hierarchical Recurrent Neural Networks;
DOI
Not available
CLC Classification Number
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
In recent years, neural network models have been applied to the automated essay scoring task and have achieved good performance. However, few studies have investigated incorporating prompt information into the neural network. The content of an essay is closely related to its topic, so the relevance between the essay and the topic can help model the relationship between the essay and its score: a high-scoring essay tends to be highly relevant to the topic, while a low-scoring essay is less similar to it. Motivated by this observation, we propose to use the similarity between the essay and the topic as auxiliary information that is concatenated into the final representation of the essay. We first use a hierarchical recurrent neural network combined with an attention mechanism to learn content representations of the essay and the topic at the sentence level and the document level. Then, we multiply the essay representation and the topic representation to obtain a similarity representation between them. Finally, we concatenate the similarity representation with the essay representation to obtain the final essay representation. We evaluated our model on the ASAP dataset, and the experimental results show that it outperforms existing state-of-the-art models.
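The pipeline described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it substitutes a vanilla tanh RNN for the recurrent cells, assumes the similarity is an element-wise product of the two document vectors, and uses arbitrary hypothetical dimensions (`EMB`, `HID`) and randomly initialized parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB, HID = 16, 8  # hypothetical embedding and hidden sizes

def rnn(xs, Wx, Wh):
    # simple tanh RNN over a sequence; returns all hidden states
    h = np.zeros(Wh.shape[0])
    out = []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h)
        out.append(h)
    return np.stack(out)

def attention_pool(H, w):
    # attention over time steps: softmax(H @ w)-weighted sum of states
    scores = H @ w
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ H

def encode_document(doc, params):
    # hierarchical encoding: sentence-level RNN + attention,
    # then document-level RNN + attention over sentence vectors
    Wx_s, Wh_s, w_s, Wx_d, Wh_d, w_d = params
    sent_vecs = np.stack(
        [attention_pool(rnn(s, Wx_s, Wh_s), w_s) for s in doc])
    return attention_pool(rnn(sent_vecs, Wx_d, Wh_d), w_d)

params = (rng.normal(size=(HID, EMB)), rng.normal(size=(HID, HID)),
          rng.normal(size=HID),
          rng.normal(size=(HID, HID)), rng.normal(size=(HID, HID)),
          rng.normal(size=HID))

# toy essay (2 sentences) and topic (1 sentence) as word-embedding matrices
essay = [rng.normal(size=(5, EMB)), rng.normal(size=(7, EMB))]
topic = [rng.normal(size=(6, EMB))]

e = encode_document(essay, params)
t = encode_document(topic, params)
sim = e * t                       # similarity representation (element-wise)
final = np.concatenate([e, sim])  # final essay representation for the scorer
print(final.shape)  # (16,)
```

The essay and topic share one encoder here so their representations live in the same space; the `final` vector would then be fed to a regression layer that predicts the score.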
Pages: 378-383
Page count: 6