Spoken language understanding via graph contrastive learning on the context-aware graph convolutional network

Cited: 0
Authors
Cao, Ze [1 ]
Liu, Jian-Wei [1 ]
Affiliation
[1] China Univ Petr, Coll Artificial Intelligence, Dept Automat, 260 Mailbox, Beijing 102249, Peoples R China
Keywords
Spoken language understanding; Graph contrastive learning; Intent detection; Dialogue act recognition; Slot filling
DOI
10.1007/s10044-024-01362-0
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A spoken language understanding (SLU) system is a crucial component of a dialogue system; its task is to comprehend the user's utterances and carry out the corresponding actions. Contextual spoken language understanding (contextual SLU) is a critical problem in this field, as it helps the system interpret user utterances more accurately, thereby improving performance. This paper aims to enhance the effectiveness of contextual SLU. Contextual SLU systems are chiefly concerned with effectively integrating dialogue context information. Current approaches usually use the same contextual information to guide slot filling for all tokens, which may introduce irrelevant information and lead to comprehension bias and ambiguity. To address this problem, we apply graph contrastive learning on top of a graph convolutional network to strengthen the model's ability to aggregate contextual information. The graph convolutional network takes context into account and aggregates it automatically, so the model no longer relies on hand-designed heuristic aggregation functions. The contrastive learning module uses the contrastive objective to achieve intent enhancement: it learns deeper semantic information and contextual relationships, improving the model on three key tasks: slot filling, dialogue act recognition, and intent detection. Experiments on synthetic dialogue datasets show that our model achieves state-of-the-art performance and significantly outperforms previous approaches (Slot F1 +1.03% on Sim-M, +2.32% on Sim-R; Act F1 +0.26% on Sim-M, +0.56% on Sim-R; Frame Acc +3.15% on Sim-M, +1.62% on Sim-R).
The code is available at: https://github.com/caoze1228/ACIUGCL-CSLU.
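The two ingredients the abstract describes, context aggregation with a graph convolutional layer and an InfoNCE-style contrastive objective over two views of the node embeddings, can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the function names, the NumPy formulation, and the two-view contrastive setup are illustrative choices.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph convolution: ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    A: (n, n) adjacency matrix, H: (n, d_in) node features,
    W: (d_in, d_out) weight matrix. Self-loops are added so each
    token/node also aggregates its own representation.
    """
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)

def info_nce(z1, z2, tau=0.5):
    """InfoNCE contrastive loss between two views of node embeddings.

    Row i of z1 and row i of z2 form a positive pair; all other rows
    act as negatives. Lower loss means the views agree more.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / tau                       # cosine similarities
    sim = sim - sim.max(axis=1, keepdims=True)    # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

In a setup like the paper's, the GCN output would feed the slot filling, dialogue act recognition, and intent detection heads, with the contrastive term added to the task losses; how the two graph views are built (e.g. edge or feature perturbation) is a design choice the abstract does not specify.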
Pages: 21