NarGNN: Narrative Graph Neural Networks for New Script Event Prediction Problem

Cited by: 4
Authors
Yang, Shuang [1 ,2 ]
Wang, Fali [1 ,2 ]
Zha, Daren [2 ]
Xue, Cong [2 ]
Tang, Zhihao [2 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
[2] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
Source
2020 IEEE INTL SYMP ON PARALLEL & DISTRIBUTED PROCESSING WITH APPLICATIONS, INTL CONF ON BIG DATA & CLOUD COMPUTING, INTL SYMP SOCIAL COMPUTING & NETWORKING, INTL CONF ON SUSTAINABLE COMPUTING & COMMUNICATIONS (ISPA/BDCLOUD/SOCIALCOM/SUSTAINCOM 2020) | 2020
Funding
National Natural Science Foundation of China;
Keywords
narrative event evolutionary graph; script event prediction; graph neural network;
DOI
10.1109/ISPA-BDCloud-SocialCom-SustainCom51426.2020.00086
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Scripts encode world knowledge that can aid text understanding, and script learning aims to acquire this knowledge automatically from unstructured text. Previous work on script event prediction only considers predicting the subsequent event given an existing event sequence. In real life, however, we want to predict events at any position rather than only the last one, because we can steer subsequent events by preventing or promoting intermediate ones. To this end, we introduce a novel approach, named NarGNN, which integrates factual and experiential knowledge to predict intermediate events. Specifically, we first construct an event evolutionary graph from a newswire corpus. Then we use a Fact Encoder Layer to encode the existing event facts, including the semantic information of each event and the sequential information among events. Third, we use a Fusion Layer to fuse the graph information, which carries experiential knowledge, with the fact embeddings obtained from the previous layer. Fourth, an Attention Layer selects the most plausible candidate. Finally, our proposed model is evaluated on the widely used New York Times corpus, and the results demonstrate significant improvements over state-of-the-art methods. It is also worth noting that NarGNN naturally extends to the previous task, with better performance than other methods.
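The four-step pipeline sketched in the abstract can be illustrated with a deliberately simplified, pure-Python toy. Everything below — the event chains, the hash-free seeded embeddings, the single averaging message-passing round, and the dot-product scoring — is a hypothetical stand-in, not the paper's actual model, which uses learned embeddings and neural graph/attention layers trained on event chains mined from the New York Times corpus.

```python
import random
from collections import defaultdict

# Toy event chains standing in for chains extracted from a newswire corpus.
chains = [
    ["enter", "order", "eat", "pay", "leave"],
    ["enter", "order", "wait", "eat", "leave"],
    ["enter", "sit", "order", "eat", "pay"],
]

# Step 1: build a narrative event evolutionary graph — directed edges
# weighted by bigram counts between adjacent events in the chains.
edges = defaultdict(float)
for chain in chains:
    for a, b in zip(chain, chain[1:]):
        edges[(a, b)] += 1.0
vocab = sorted({e for c in chains for e in c})

# Step 2: "fact encoding" — a deterministic per-event vector seeded by the
# event name (a stand-in for a learned embedding of event semantics).
DIM = 8
def embed(event):
    rng = random.Random(event)  # str seed is deterministic across runs
    return [rng.uniform(-1.0, 1.0) for _ in range(DIM)]
emb = {e: embed(e) for e in vocab}

# Step 3: "fusion" — one round of message passing that mixes each event's
# own embedding with the weighted average of its graph successors.
def fuse(emb):
    out = {}
    for e in vocab:
        nbrs = [(b, w) for (a, b), w in edges.items() if a == e]
        total = sum(w for _, w in nbrs) or 1.0
        agg = [sum(w * emb[b][i] for b, w in nbrs) / total for i in range(DIM)]
        out[e] = [0.5 * emb[e][i] + 0.5 * agg[i] for i in range(DIM)]
    return out
fused = fuse(emb)

# Step 4: "attention" — score each candidate for the missing intermediate
# slot by its average similarity to the observed context events.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def predict_missing(context, candidates):
    scores = {c: sum(dot(fused[c], fused[x]) for x in context) / len(context)
              for c in candidates}
    return max(scores, key=scores.get)

# Fill the gap between "order" and "pay" in a partially observed chain.
choice = predict_missing(["enter", "order", "pay", "leave"],
                         ["eat", "sit", "wait"])
print(choice)
```

The key departure from classic script event prediction is visible in the last call: the unknown event sits in the middle of the chain, so the scorer conditions on context from both sides rather than only on a prefix.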
Pages: 481-488
Number of pages: 8
References (23 in total)
[1] [Anonymous], 1975, REPRESENTATION UNDER
[2] Chambers Nathanael, 2008, P ACL 08, P789
[3] Chaturvedi S., 2017, P EMNLP ASS COMP LIN, P1603
[4] Fillmore Charles J., 1982, Linguistics in the Morning Calm: Selected Papers from SICOL-1981, P111, DOI 10.1016/B0-08-044854-2/00424-7
[5] Gori M, 2005, IEEE IJCNN, P729
[6] Granroth-Wilding M, 2016, AAAI CONF ARTIF INTE, P2727
[7] Grover A., Leskovec J., 2016, node2vec: Scalable Feature Learning for Networks, KDD'16: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, P855-864
[8] Jans B., 2012, EACL 2012, P336
[9] Lee IT, 2018, AAAI CONF ARTIF INTE, P4840
[10] Li Y., 2016, P ICLR 16, DOI 10.48550/ARXIV.1511.05493