MULTI-RELATION MESSAGE PASSING FOR MULTI-LABEL TEXT CLASSIFICATION

Cited: 11
Authors
Ozmen, Muberra [1 ]
Zhang, Hao [2 ]
Wang, Pengyun [3 ]
Coates, Mark [1 ]
Affiliations
[1] McGill Univ, Montreal, PQ, Canada
[2] Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China
[3] Huawei Noah's Ark Lab, Shenzhen, Peoples R China
Source
2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP) | 2022
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
multi-label classification; text classification; multi-relation GNNs;
D O I
10.1109/ICASSP43922.2022.9747225
Chinese Library Classification
O42 [Acoustics];
Subject Classification Codes
070206 ; 082403 ;
Abstract
A well-known challenge associated with the multi-label classification problem is modelling dependencies between labels. Most attempts at modelling label dependencies focus on co-occurrences, ignoring the valuable information that can be extracted by detecting label subsets that rarely occur together. For example, consider customer product reviews; a product probably would not simultaneously be tagged by both "recommended" (i.e., reviewer is happy and recommends the product) and "urgent" (i.e., the review suggests immediate action to remedy an unsatisfactory experience). Aside from the consideration of positive and negative dependencies, the direction of a relationship should also be considered. For a multi-label image classification problem, the "ship" and "sea" labels have an obvious dependency, but the presence of the former implies the latter much more strongly than the other way around. These examples motivate the modelling of multiple types of bi-directional relationships between labels. In this paper, we propose a novel method, entitled Multi-relation Message Passing (MrMP), for the multi-label classification problem. Experiments on benchmark multi-label text classification datasets show that the MrMP module yields similar or superior performance compared to state-of-the-art methods. The approach imposes only minor additional computational and memory overheads.
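The abstract's core idea, passing messages between label embeddings along several relation types (e.g., positive and negative co-occurrence, each possibly directed), can be sketched as follows. This is a minimal illustration of multi-relation message passing, not the paper's actual MrMP module; the two-relation setup, function names, and random weights are all assumptions.

```python
import numpy as np

def multi_relation_message_passing(label_emb, adjacencies, weights):
    """One round of message passing over label embeddings.

    Each relation type r has its own label-by-label adjacency matrix A_r
    and its own projection matrix W_r. An asymmetric A_r encodes a
    directed relation (e.g., "ship" implying "sea" more than the reverse).
    Messages from all relation types are summed, then a ReLU is applied.
    """
    updated = np.zeros_like(label_emb)
    for A, W in zip(adjacencies, weights):
        # Row-normalize so each label averages its neighbours' messages;
        # rows with no neighbours stay zero.
        deg = A.sum(axis=1, keepdims=True)
        A_norm = np.divide(A, deg, out=np.zeros_like(A), where=deg > 0)
        updated += A_norm @ label_emb @ W
    return np.maximum(updated, 0.0)  # ReLU

# Toy example: 4 labels, 2 relation types
# (e.g., positive and negative co-occurrence).
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
A_pos = np.array([[0, 1, 0, 0],
                  [1, 0, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
A_neg = np.array([[0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
W_pos, W_neg = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))
out = multi_relation_message_passing(emb, [A_pos, A_neg], [W_pos, W_neg])
print(out.shape)  # (4, 8)
```

In this sketch the per-relation projections are what let the model treat rarely-co-occurring label pairs differently from frequently-co-occurring ones, rather than collapsing both into a single co-occurrence graph.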
Pages: 3583-3587
Number of Pages: 5
References
25 in total
[1] Bai J., 2020, P INT JOINT C ART IN
[2] Benites, Fernando; Sapozhnikova, Elena. HARAM: a Hierarchical ARAM neural network for large-scale text classification. 2015 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOP (ICDMW), 2015: 847-854
[3] Bhatia Kush, 2015, Advances in Neural Information Processing Systems, V28
[4] Chen C, 2019, AAAI CONF ARTIF INTE, P3304
[5] Chen, Zhi; Ho, Pin-Han. Global-connected network with generalized ReLU activation. PATTERN RECOGNITION, 2019, 96
[6] Dembczynski Krzysztof, 2010, P 27 INT C INT C MAC, P279
[7] Katakis I., 2008, P ECML PKDD DISCOVER
[8] King DB, 2015, ACS SYM SER, V1214, P1, DOI 10.1021/bk-2015-1214.ch001
[9] Kipf T. N., 2017, ICLR, P1, DOI 10.48550/arXiv.1609.02907
[10] Lanchantin, Jack; Sekhon, Arshdeep; Qi, Yanjun. Neural Message Passing for Multi-label Classification. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2019, PT II, 2020, 11907: 138-163