CGraphNet: Contrastive Graph Context Prediction for Sparse Unlabeled Short Text Representation Learning on Social Media

Cited by: 0
Authors
Chen, Junyang [1]
Guo, Jingcai [2]
Li, Xueliang [3]
Wang, Huan [4]
Xu, Zhenghua [5]
Gong, Zhiguo [6]
Zhang, Liangjie [1]
Leung, Victor C. M. [1]
Affiliations
[1] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[2] Hong Kong Polytech Univ, Dept Comp, Hong Kong 999077, Peoples R China
[3] Shenzhen Univ, Natl Engn Lab Big Data Syst Comp Technol, Shenzhen 518060, Peoples R China
[4] Huazhong Agr Univ, Coll Informat, Wuhan 430070, Peoples R China
[5] Hebei Univ Technol, State Key Lab Reliabil & Intelligence Elect Equipm, Tianjin 300401, Peoples R China
[6] Univ Macau, Dept Comp Informat Sci, State Key Lab Internet Things Smart City, Macau 999078, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Representation learning; Context modeling; Predictive models; Social networking (online); Encoding; Bidirectional control; Recurrent neural networks; Long short term memory; Transformers; Probabilistic logic; Contrastive graph context prediction; sequential learning; social media short text representation learning; sparsity problem; text mining;
DOI
10.1109/TCSS.2024.3452695
Chinese Library Classification (CLC)
TP3 [computing technology; computer technology];
Discipline Code
0812;
Abstract
Unlabeled text representation learning (UTRL), encompassing static word embeddings such as Word2Vec and contextualized word embeddings such as bidirectional encoder representations from transformers (BERT), aims to capture semantic word relationships in a low-dimensional space without manual labeling. These embeddings are invaluable for downstream tasks such as document classification and clustering. However, the surge of short texts generated daily on social media platforms results in sparse word co-occurrences, compromising UTRL outcomes. Contextualized models such as recurrent neural networks (RNNs) and BERT, while impressive, often struggle to predict the next word because word sequences in short texts are sparse. To address this, we introduce CGraphNet, a contrastive graph context prediction model designed for UTRL. This approach converts short texts into graphs, establishing links between sequentially occurring words. Information from the next word and its neighbors informs the target prediction, a process referred to as graph context prediction, which mitigates sparse word co-occurrence in brief sentences. To minimize noise, an attention mechanism assigns importance to neighbors, while a contrastive objective encourages more distinctive representations by comparing the target word with its neighbors. Our experiments demonstrate CGraphNet's superior performance over other baselines, particularly in classification and clustering tasks on real-world datasets.
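The pipeline the abstract describes — link sequentially occurring words into a graph, weight a word's neighbors with attention, and train with a contrastive objective — can be sketched in NumPy. This is a minimal illustrative toy, not the authors' implementation: the function names, toy texts, embedding dimension, and the InfoNCE-style loss form are all assumptions made here for demonstration.

```python
import numpy as np
from collections import defaultdict

def build_word_graph(texts):
    """Link words that occur sequentially in any short text (undirected)."""
    graph = defaultdict(set)
    for text in texts:
        words = text.split()
        for a, b in zip(words, words[1:]):
            graph[a].add(b)
            graph[b].add(a)
    return graph

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def graph_context(word, graph, emb):
    """Attention-weighted aggregation of a word's graph neighbors."""
    neighbors = sorted(graph[word])
    H = np.stack([emb[n] for n in neighbors])   # (k, d) neighbor embeddings
    att = softmax(H @ emb[word])                # importance of each neighbor
    return att @ H                              # (d,) graph-context vector

def contrastive_loss(word, graph, emb, temperature=0.5):
    """InfoNCE-style objective: a word should match its own graph context
    better than the contexts of other words (the negatives)."""
    words = sorted(graph)
    ctx = np.stack([graph_context(w, graph, emb) for w in words])
    sims = ctx @ emb[word] / temperature
    pos = words.index(word)
    return -sims[pos] + np.log(np.exp(sims).sum())

# Toy corpus of sparse short texts (illustrative only).
rng = np.random.default_rng(0)
texts = ["cheap flights to rome", "rome flights deal", "best pizza in rome"]
graph = build_word_graph(texts)
emb = {w: rng.normal(size=8) for w in graph}
ctx = graph_context("rome", graph, emb)  # 8-dim context vector for "rome"
```

In a full model the embeddings would be trained by minimizing the contrastive loss (e.g., by gradient descent), so that each word moves toward its own attention-weighted graph context and away from the contexts of other words.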
Pages: 15
Cited References
44 records
[1]  
Abadi M, 2016, PROCEEDINGS OF OSDI'16: 12TH USENIX SYMPOSIUM ON OPERATING SYSTEMS DESIGN AND IMPLEMENTATION, P265
[2]  
[Anonymous], 2013, NeurIPS, DOI 10.48550/arXiv.1310.4546
[3]   Textual One-Pass Stream Clustering with Automated Distance Threshold Adaption [J].
Assenmacher, Dennis ;
Trautmann, Heike .
INTELLIGENT INFORMATION AND DATABASE SYSTEMS, ACIIDS 2022, PT I, 2022, 13757 :3-16
[4]  
Ba J, 2014, ACS SYM SER
[5]  
Banerjee Somnath, 2007, 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, P787, DOI 10.1145/1277741.1277909
[6]  
Lipton ZC, 2015, Arxiv, DOI arXiv:1506.00019
[7]   Inductive Document Representation Learning for Short Text Clustering [J].
Chen, Junyang ;
Gong, Zhiguo ;
Wang, Wei ;
Dong, Xiao ;
Liu, Weiwen ;
Wang, Cong ;
Chen, Xian .
MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2020, PT III, 2021, 12459 :600-616
[8]   A Dirichlet process biterm-based mixture model for short text stream clustering [J].
Chen, Junyang ;
Gong, Zhiguo ;
Liu, Weiwen .
APPLIED INTELLIGENCE, 2020, 50 (05) :1609-1619
[9]   A nonparametric model for online topic discovery with word embeddings [J].
Chen, Junyang ;
Gong, Zhiguo ;
Liu, Weiwen .
INFORMATION SCIENCES, 2019, 504 :32-47
[10]   BTM: Topic Modeling over Short Texts [J].
Cheng, Xueqi ;
Yan, Xiaohui ;
Lan, Yanyan ;
Guo, Jiafeng .
IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2014, 26 (12) :2928-2941