Label-Aware Recurrent Reading for Multi-Label Classification

Citations: 0
Authors
Ming, Shenglan [1 ]
Liu, Huajun [1 ]
Luo, Ziming [1 ]
Huang, Peng [1 ]
Li, Mark Junjie [1 ]
Affiliations
[1] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen, Peoples R China
Source
2022 ASIA CONFERENCE ON ALGORITHMS, COMPUTING AND MACHINE LEARNING (CACML 2022) | 2022
Keywords
Multi-label classification; Deep learning; Gated recurrent unit; Attention mechanism; Neural networks
DOI
10.1109/CACML55074.2022.00091
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multi-label classification (MLC) is an essential branch of natural language processing in which a given instance may be associated with multiple labels. Recently, neural network approaches have exploited dependencies between labels and instances, achieving state-of-the-art performance. However, existing methods ignore the hidden correlations between each document's semantic information and its labels. In this paper, inspired by the cognitive process of human reading, we propose a Label-Aware Recurrent Reading (LARD) network grounded in neuroscience. LARD models MLC as a decision-making process of recurrent reading and constructs a label-aware document representation following the top-down mechanism described in neuroscience. The model outputs predictions for all labels after each reading pass, and prediction accuracy improves over the course of recurrent reading. In addition, an attention mechanism dynamically adjusts the weights of words according to the top-down classification predictions, accounting for the different contributions of words to labels. Experiments show that our model outperforms existing models.
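The top-down recurrent-reading idea in the abstract (previous label predictions re-weight word attention, the attended document vector yields the next prediction) can be illustrated with a minimal sketch. This is not the authors' implementation: the dimensions, weight matrices (`W_att`, `W_out`), and the single-layer attention form are all hypothetical stand-ins for the paper's GRU-based architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper does not specify sizes here).
T, d, L = 6, 8, 4   # number of words, embedding dim, number of labels

W_att = rng.normal(scale=0.1, size=(L, d))   # projects label predictions into word space
W_out = rng.normal(scale=0.1, size=(d, L))   # maps a document vector to label scores

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def recurrent_read(words, n_reads=3):
    """One label-aware reading pass per iteration: the previous label
    prediction forms a top-down attention query over the words, and the
    attended document representation produces the next prediction."""
    y = np.full(L, 0.5)                      # uninformative initial prediction
    preds = []
    for _ in range(n_reads):
        query = y @ W_att                    # top-down query from current predictions
        alpha = softmax(words @ query)       # attention weights over words
        doc = alpha @ words                  # label-aware document representation
        y = sigmoid(doc @ W_out)             # multi-label probabilities
        preds.append(y)
    return preds

words = rng.normal(size=(T, d))              # stand-in word embeddings
preds = recurrent_read(words)
```

Each pass emits a full vector of label probabilities, matching the abstract's claim that the model predicts all labels after every reading.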
Pages: 498-504
Page count: 7