A Novel Source Code Representation Approach Based on Multi-Head Attention

Cited by: 0
Authors
Xiao, Lei [1 ]
Zhong, Hao [1 ]
Liu, Jianjian [1 ]
Zhang, Kaiyu [1 ]
Xu, Qizhen [1 ]
Chang, Le [2 ]
Affiliations
[1] Xiamen Univ Technol, Coll Comp & Informat Engn, Xiamen 361024, Peoples R China
[2] Software Secur Co, Chengdu 610041, Peoples R China
Keywords
multi-head attention; code clone; code classification; source code representation; clone detection
DOI
10.3390/electronics13112111
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Code classification and code clone detection are crucial for understanding and maintaining large software systems. Although deep learning surpasses traditional techniques in capturing the features of source code, existing models suffer from limited processing capability and high complexity. We propose a novel source code representation method based on the multi-head attention mechanism (SCRMHA). SCRMHA captures the vector representation of entire code segments, enabling it to focus on different positions of the input sequence, capture richer semantic information, and process different aspects and relationships of the sequence simultaneously. Moreover, it computes multiple attention heads in parallel, which speeds up the computation. We evaluate SCRMHA on both a standard dataset and a real-world industrial dataset, and analyze the differences between the two. Experimental results on code classification and clone detection tasks show that SCRMHA consumes less time and reduces complexity by about one-third compared with traditional source code feature representation methods, demonstrating that SCRMHA reduces computational complexity and time consumption while maintaining accuracy.
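For orientation, the sketch below illustrates the general idea the abstract describes: encoding a tokenized code segment into a single vector with multi-head self-attention, where each head can attend to different positions and relations in the sequence and all heads are computed in parallel. It is a minimal PyTorch sketch, not the authors' SCRMHA implementation; the vocabulary size, embedding width, head count, and mean-pooling step are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above), not the SCRMHA model from the paper.
import torch
import torch.nn as nn

class CodeSegmentEncoder(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=256, num_heads=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Multi-head self-attention: each head attends to different positions
        # and relationships in the token sequence; heads run in parallel.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, token_ids):              # token_ids: (batch, seq_len)
        x = self.embed(token_ids)              # (batch, seq_len, embed_dim)
        ctx, _ = self.attn(x, x, x)            # self-attention over code tokens
        return ctx.mean(dim=1)                 # (batch, embed_dim) segment vector

# Usage: encode a batch of two code segments, each 16 tokens long.
encoder = CodeSegmentEncoder()
tokens = torch.randint(0, 10000, (2, 16))
vectors = encoder(tokens)
print(vectors.shape)  # torch.Size([2, 256])
```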
Pages: 22
Related Papers
50 records in total
  • [31] A Network Intrusion Detection Model Based on BiLSTM with Multi-Head Attention Mechanism
    Zhang, Jingqi
    Zhang, Xin
    Liu, Zhaojun
    Fu, Fa
    Jiao, Yihan
    Xu, Fei
    ELECTRONICS, 2023, 12 (19)
  • [32] Implementation and Application of Violence Detection System Based on Multi-head Attention and LSTM
    Cao, Fengping
    Miao, Yi
    Zhang, Wangyi
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT VII, ICIC 2024, 2024, 14868 : 77 - 88
  • [33] Interactive Selection Recommendation Based on the Multi-head Attention Graph Neural Network
    Zhang, Shuxi
    Chen, Jianxia
    Yao, Meihan
    Wu, Xinyun
    Ge, Yvfan
    Li, Shu
    NEURAL INFORMATION PROCESSING, ICONIP 2023, PT III, 2024, 14449 : 447 - 458
  • [34] Analyzing and Controlling Inter-Head Diversity in Multi-Head Attention
    Yun, Hyeongu
    Kang, Taegwan
    Jung, Kyomin
    APPLIED SCIENCES-BASEL, 2021, 11 (04): : 1 - 14
  • [35] Trajectory Prediction in Complex Scenes Based on Multi-Head Attention Adversarial Mechanism
    Yu L.
    Li H.-Y.
    Jiao C.-L.
    Leng Y.-F.
    Xu G.-Y.
    Jisuanji Xuebao/Chinese Journal of Computers, 2022, 45 (06): : 1133 - 1146
  • [36] MRE: A Military Relation Extraction Model Based on BiGRU and Multi-Head Attention
    Lu, Yiwei
    Yang, Ruopeng
    Jiang, Xuping
    Zhou, Dan
    Yin, Changsheng
    Li, Zizhuo
    SYMMETRY-BASEL, 2021, 13 (09):
  • [37] Remaining Useful Life Prediction of Aeroengines Based on Multi-Head Attention Mechanism
    Nie, Lei
    Xu, Shiyi
    Zhang, Lvfan
    Yin, Yehan
    Dong, Zhengqiong
    Zhou, Xiangdong
    MACHINES, 2022, 10 (07)
  • [38] Speech enhancement method based on the multi-head self-attention mechanism
    Chang X.
    Zhang Y.
    Yang L.
    Kou J.
    Wang X.
    Xu D.
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2020, 47 (01): : 104 - 110
  • [39] ADDGCN: A Novel Approach with Down-Sampling Dynamic Graph Convolution and Multi-Head Attention for Traffic Flow Forecasting
    Li, Zuhua
    Wei, Siwei
    Wang, Haibo
    Wang, Chunzhi
    APPLIED SCIENCES-BASEL, 2024, 14 (10):
  • [40] Generating Patent Text Abstracts Based on Improved Multi-head Attention Mechanism
    Guoliang S.
    Shu Z.
    Yunfeng W.
    Chunjiang S.
    Liang L.
    Data Analysis and Knowledge Discovery, 2023, 7 (06) : 61 - 72