Contrastive learning based on linguistic knowledge and adaptive augmentation for text classification

Cited by: 0
Authors
Zhang, Shaokang [1 ]
Ran, Ning [2 ]
Affiliations
[1] Hebei Univ, Sch Cyber Secur & Comp, Baoding, Peoples R China
[2] Hebei Univ, Coll Elect & Informat Engn, Baoding, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Text classification; Contrastive learning; Linguistic knowledge; Adaptive data augmentation; REPRESENTATION;
DOI
10.1016/j.knosys.2024.112189
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Pre-trained language models based on contrastive learning have been shown to be effective for text classification. Despite this success, contrastive learning still has some limitations. First, external linguistic knowledge has been shown to improve the performance of pre-trained language models, but how to use it in contrastive learning remains unclear. Second, general contrastive learning generates training samples with a fixed data augmentation throughout training, while different augmentation methods suit different downstream tasks, so fixed data augmentation can lead to suboptimal settings. In this paper, we propose contrastive learning based on linguistic knowledge and adaptive augmentation, which obtains high-quality sentence representations to improve the performance of text classification. Specifically, we construct word-level positive and negative sample pairs with WordNet and propose a novel word-level contrastive learning objective to inject linguistic knowledge. We then dynamically select the augmentation policy via alignment and uniformity. This adaptive augmentation policy acquires more generalized sentence representations with little computational overhead. Experiments on multiple public datasets demonstrate that our method outperforms state-of-the-art methods.
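
The word-level contrastive objective described in the abstract depends on synonym and antonym pairs mined from WordNet. The following minimal sketch shows one way such pairs could be collected with NLTK's WordNet interface (requires nltk and the wordnet corpus); the function name wordnet_pairs and the filtering rules are illustrative assumptions, not the authors' implementation.

    # Hypothetical helper for mining word-level positive (synonym) and
    # negative (antonym) candidates from WordNet via NLTK.
    # Assumes: pip install nltk; nltk.download("wordnet")
    from nltk.corpus import wordnet as wn

    def wordnet_pairs(word):
        """Return (synonyms, antonyms) of `word` as candidate positive/negative pairs."""
        positives, negatives = set(), set()
        for synset in wn.synsets(word):
            for lemma in synset.lemmas():
                name = lemma.name().replace("_", " ")
                if name.lower() != word.lower():
                    positives.add(name)  # same synset -> positive candidate
                for ant in lemma.antonyms():
                    negatives.add(ant.name().replace("_", " "))  # antonym -> negative candidate
        return sorted(positives), sorted(negatives)

    if __name__ == "__main__":
        pos, neg = wordnet_pairs("good")
        print("positives:", pos[:5])
        print("negatives:", neg[:5])

In a contrastive setup, the token embeddings of these pairs would feed an InfoNCE-style word-level loss; the exact loss used in the paper is not reproduced here.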
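
The abstract also states that the augmentation policy is selected dynamically by alignment and uniformity. Below is a minimal PyTorch sketch of these two standard metrics (in the sense of Wang and Isola, 2020), together with an illustrative, assumed selection rule that keeps the candidate augmentation whose positives give the lowest combined score; the paper's actual selection criterion may differ.

    # Alignment: mean distance between anchor and positive embeddings (lower is better).
    # Uniformity: log of the mean Gaussian potential over all pairs (lower = more spread out).
    import torch
    import torch.nn.functional as F

    def alignment(x, y, alpha=2):
        # x, y: L2-normalized embeddings of anchors and their positives, shape (N, d)
        return (x - y).norm(dim=1).pow(alpha).mean()

    def uniformity(x, t=2):
        # x: L2-normalized embeddings, shape (N, d)
        return torch.pdist(x, p=2).pow(2).mul(-t).exp().mean().log()

    # Toy example: score two hypothetical augmentation policies on random embeddings
    # and keep the one with the lower alignment + uniformity sum (assumed rule).
    anchors = F.normalize(torch.randn(128, 64), dim=1)
    candidates = {
        "dropout":      F.normalize(anchors + 0.05 * torch.randn(128, 64), dim=1),
        "token_cutoff": F.normalize(anchors + 0.20 * torch.randn(128, 64), dim=1),
    }
    scores = {name: (alignment(anchors, pos) + uniformity(pos)).item()
              for name, pos in candidates.items()}
    best_policy = min(scores, key=scores.get)
    print(scores, "-> selected:", best_policy)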
Pages: 10