Hierarchical text classification using CNNs with local approaches

Cited by: 0
Authors
Krendzelak M. [1 ]
Jakab F. [1 ]
Affiliations
[1] Technical University of Košice, Faculty of Electrical Engineering and Informatics, Department of Computers and Informatics, Letná 9, Košice
Keywords
Convolutional neural network; Hierarchical text classification; Local top-down approach;
DOI
10.31577/CAI_2020_5_907
Abstract
In this paper, we discuss the application of convolutional neural networks (CNNs) to hierarchical text classification using local top-down approaches. We present experimental results implementing a local classification per node approach, a local classification per parent node approach, and a local classification per level approach. A hierarchical version of the 20 Newsgroups training dataset with more than 20 categories and three hierarchical levels was used to train the models. The experiments involved several variations of hyperparameter settings such as batch size, embedding size, and the number of available examples from the training dataset, as well as two variants of CNN text embedding: static (stat) and random (rand). The results demonstrated that our proposed use of CNNs outperformed a flat CNN baseline model and both the flat and hierarchical support vector machine (SVM) and logistic regression (LR) baseline models. In particular, hierarchical text classification with CNN-stat models using the local per parent node and local per level approaches achieved compelling results and outperformed the state-of-the-art baseline models. However, the CNN with the local per node approach underperformed and achieved worse results. Furthermore, we performed a detailed comparison of the proposed hierarchical local approaches with CNNs. The results indicated that the hierarchical local classification per level approach using the CNN model with static text embedding achieved the best results, surpassing the flat SVM and LR baseline models by 7% and 13%, the flat CNN baseline by 5%, and the h-SVM and h-LR models by 5% and 10%, respectively. © 2021 Slovak Academy of Sciences. All rights reserved.
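The three local top-down schemes named in the abstract differ in where classifiers are trained: per node, per parent node, or per level. A minimal sketch of top-down inference under the "local classifier per parent node" setup is shown below; the toy hierarchy, the keyword-based stand-in classifiers, and all names here are illustrative assumptions, not the paper's actual CNN models or its 20 Newsgroups taxonomy.

```python
# Sketch of "local classification per parent node" inference: one multiclass
# classifier at each parent routes an example down to a leaf category.
# HIERARCHY and the keyword classifiers are hypothetical stand-ins for the
# paper's trained CNN-stat/CNN-rand models.

# Toy 3-level hierarchy: parent label -> list of child labels.
HIERARCHY = {
    "root": ["comp", "rec"],
    "comp": ["comp.graphics", "comp.os"],
    "rec": ["rec.autos", "rec.sport"],
}

def make_keyword_classifier(keywords):
    """Stand-in for a trained per-parent CNN: scores each child by keyword hits."""
    def classify(text):
        scores = {child: sum(word in text for word in words)
                  for child, words in keywords.items()}
        return max(scores, key=scores.get)
    return classify

# One local classifier per parent node (illustrative keyword rules).
CLASSIFIERS = {
    "root": make_keyword_classifier({"comp": ["gpu", "linux"],
                                     "rec": ["car", "hockey"]}),
    "comp": make_keyword_classifier({"comp.graphics": ["gpu"],
                                     "comp.os": ["linux"]}),
    "rec": make_keyword_classifier({"rec.autos": ["car"],
                                    "rec.sport": ["hockey"]}),
}

def predict_top_down(text):
    """Route the example from the root to a leaf, one decision per parent."""
    node = "root"
    path = []
    while node in HIERARCHY:  # stop once the chosen node is a leaf
        node = CLASSIFIERS[node](text)
        path.append(node)
    return path
```

Each parent's classifier only discriminates among its own children, so errors at an upper level propagate downward; this is the blocking behavior that typically distinguishes top-down local approaches from flat classification over all leaf categories.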
Pages: 907-924
Page count: 17
Related papers
50 records
  • [1] Hierarchical text classification using CNNs with local approaches
    Krendzelak, Milan
    Jakab, Frantisek
    COMPUTING AND INFORMATICS, 2020, 39 (05) : 907 - 924
  • [2] An analysis of hierarchical text classification using word embeddings
    Stein, Roger Alan
    Jaques, Patricia A.
    Valiati, Joao Francisco
    INFORMATION SCIENCES, 2019, 471 : 216 - 232
  • [3] Experiments with hierarchical text classification
    Granitzer, M
    Auer, P
    PROCEEDINGS OF THE NINTH IASTED INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, 2005, : 177 - 182
  • [4] Hierarchical text classification methods and their specification
    Sun, AX
    Lim, EP
    Ng, WK
    COOPERATIVE INTERNET COMPUTING, 2003, 729 : 236 - 256
  • [5] Hierarchical Text Classification Incremental Learning
    Song, Shengli
    Qiao, Xiaofei
    Chen, Ping
    NEURAL INFORMATION PROCESSING, PT 1, PROCEEDINGS, 2009, 5863 : 247 - 258
  • [6] Exploring Different Normalization and Classification Approaches for Mammography Analysis with CNNs
    Perre, Ana C.
    Alexandre, Luis A.
    Freire, Luis C.
    APPLICATIONS OF INTELLIGENT SYSTEMS, 2018, 310 : 315 - 323
  • [7] Adaptive Hierarchical Text Classification Using ERNIE and Dynamic Threshold Pruning
    Chen, Han
    Zhang, Yangsen
    Jiang, Yuru
    Duan, Ruixue
    IEEE ACCESS, 2024, 12 : 193641 - 193652
  • [8] Disentangled feature graph for Hierarchical Text Classification
    Liu, Renyuan
    Zhang, Xuejie
    Wang, Jin
    Zhou, Xiaobing
    INFORMATION PROCESSING & MANAGEMENT, 2025, 62 (03)
  • [9] Text Classification with Imperfect Hierarchical Structure Knowledge
    Ngo-Ye, Thomas
    Dutt, Abhijit
    AMCIS 2010 PROCEEDINGS, 2010,
  • [10] JumpLiteGCN: A Lightweight Approach to Hierarchical Text Classification
    Liu, Teng
    Liu, Xiangzhi
    Dong, Yunfeng
    Wu, Xiaoming
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, PT IV, NLPCC 2024, 2025, 15362 : 54 - 66