ALSI-Transformer: Transformer-Based Code Comment Generation With Aligned Lexical and Syntactic Information

Cited by: 1
Authors
Park, Youngmi [1 ]
Park, Ahjeong [1 ]
Kim, Chulyun [1 ]
Affiliations
[1] Sookmyung Womens Univ, Dept Informat Technol Engn, Seoul 04310, South Korea
Keywords
Codes; Source coding; Syntactics; Data mining; Transformers; Machine translation; Logic gates; Program comprehension; comment generation; natural language processing; deep learning;
DOI
10.1109/ACCESS.2023.3268638
CLC classification number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Code comments explain the operational process of a computer program and increase the long-term productivity of programming tasks such as debugging and maintenance. Methods that automatically generate natural language comments from programming code are therefore needed. With the development of deep learning, various models from the natural language processing domain have been applied to comment generation, and recent studies have improved performance by simultaneously using the lexical information of code tokens and the syntactic information obtained from the syntax tree. In this paper, to improve the accuracy of automatic comment generation, we introduce a novel syntactic sequence, the Code-Aligned Type sequence (CAT), which aligns the order and length of lexical and syntactic information, and we propose a new neural network model, the Aligned Lexical and Syntactic information-Transformer (ALSI-Transformer), based on a transformer that encodes the aligned multi-modal information with convolution and embedding aggregation layers. Through in-depth experiments, we compare ALSI-Transformer with current baseline methods using standard machine translation metrics and demonstrate that the proposed method achieves state-of-the-art performance in code comment generation.
Pages: 39037 - 39047
Number of pages: 11
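
The abstract describes encoding two aligned, equal-length sequences per code snippet, the lexical tokens and the Code-Aligned Type sequence (CAT), fused through convolution and embedding aggregation layers before a transformer. The PyTorch sketch below is only a minimal illustration of that idea under our own assumptions; the layer sizes, the convolution kernel size, and the element-wise-sum aggregation are illustrative choices, not details taken from the paper.

```python
import torch
import torch.nn as nn

class AlignedCodeEncoder(nn.Module):
    """Sketch of an encoder over aligned lexical and syntactic (CAT) sequences.

    Hypothetical illustration only: layer sizes, kernel size, and sum-based
    aggregation are assumptions, not the ALSI-Transformer's actual settings.
    Positional encoding is omitted for brevity.
    """

    def __init__(self, vocab_size, type_vocab_size, d_model=256, n_heads=4, n_layers=2):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)        # lexical embeddings
        self.type_emb = nn.Embedding(type_vocab_size, d_model)  # syntactic (CAT) embeddings
        # 1-D convolution over the type-embedding sequence (kernel size assumed)
        self.type_conv = nn.Conv1d(d_model, d_model, kernel_size=3, padding=1)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, token_ids, type_ids):
        # token_ids, type_ids: (batch, seq_len); equal length because the CAT
        # sequence is aligned one-to-one with the code tokens.
        lex = self.tok_emb(token_ids)
        syn = self.type_emb(type_ids)
        syn = self.type_conv(syn.transpose(1, 2)).transpose(1, 2)
        fused = lex + syn  # embedding aggregation (element-wise sum as one simple choice)
        return self.encoder(fused)

# Toy usage: two snippets of five tokens each, with made-up vocabulary sizes.
enc = AlignedCodeEncoder(vocab_size=1000, type_vocab_size=50)
tokens = torch.randint(0, 1000, (2, 5))
types = torch.randint(0, 50, (2, 5))
print(enc(tokens, types).shape)  # torch.Size([2, 5, 256])
```

A comment decoder (omitted here) would attend over the fused representation to generate natural language tokens, as in a standard transformer for machine translation.
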
Related papers
50 records in total
  • [41] The interactive reading task: Transformer-based automatic item generation
    Attali, Yigal
    Runge, Andrew
    LaFlair, Geoffrey T.
    Yancey, Kevin
    Goodwin, Sarah
    Park, Yena
    von Davier, Alina A.
    FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2022, 5
  • [42] A transformer-based approach for improving app review response generation
    Zhang, Weizhe
    Gu, Wenchao
    Gao, Cuiyun
    Lyu, Michael R.
    SOFTWARE-PRACTICE & EXPERIENCE, 2023, 53 (02) : 438 - 454
  • [43] Transformer-based protein generation with regularized latent space optimization
    Castro, Egbert
    Godavarthi, Abhinav
    Rubinfien, Julian
    Givechian, Kevin
    Bhaskar, Dhananjay
    Krishnaswamy, Smita
    NATURE MACHINE INTELLIGENCE, 2022, 4 : 840 - 851
  • [44] Understanding the Robustness of Transformer-Based Code Intelligence via Code Transformation: Challenges and Opportunities
    Li, Yaoxian
    Qi, Shiyi
    Gao, Cuiyun
    Peng, Yun
    Lo, David
    Lyu, Michael R.
    Xu, Zenglin
    IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, 2025, 51 (02) : 521 - 547
  • [45] Sparse Transformer-Based Sequence Generation for Visual Object Tracking
    Tian, Dan
    Liu, Dong-Xin
    Wang, Xiao
    Hao, Ying
    IEEE ACCESS, 2024, 12 : 154418 - 154425
  • [46] Synthetic seismocardiogram generation using a transformer-based neural network
    Nikbakht, Mohammad
    Gazi, Asim H.
    Zia, Jonathan
    An, Sungtae
    Lin, David J.
    Inan, Omer T.
    Kamaleswaran, Rishikesan
    JOURNAL OF THE AMERICAN MEDICAL INFORMATICS ASSOCIATION, 2023, 30 (07) : 1266 - 1273
  • [47] A Transformer-Based Model for Multi-Track Music Generation
    Jin, Cong
    Wang, Tao
    Liu, Shouxun
    Tie, Yun
    Li, Jianguang
    Li, Xiaobing
    Lui, Simon
    INTERNATIONAL JOURNAL OF MULTIMEDIA DATA ENGINEERING & MANAGEMENT, 2020, 11 (03) : 36 - 54
  • [48] Transformer-based protein generation with regularized latent space optimization
    Castro, Egbert
    Godavarthi, Abhinav
    Rubinfien, Julian
    Givechian, Kevin
    Bhaskar, Dhananjay
    Krishnaswamy, Smita
    NATURE MACHINE INTELLIGENCE, 2022, 4 (10) : 840 - 851
  • [49] TTVAE: Transformer-based generative modeling for tabular data generation
    Wang, Alex X.
    Nguyen, Binh P.
    ARTIFICIAL INTELLIGENCE, 2025, 340
  • [50] Transformer-based Generation of Confrontation Network in Digital Art Applications
    Meng, Huiping
    Gao, Feng
    Li, Dong
    Liu, Yue
    Xu, Jianhui
    Wang, Mengjiao
    Yang, Jian
    PROCEEDINGS OF 2024 INTERNATIONAL CONFERENCE ON MACHINE INTELLIGENCE AND DIGITAL APPLICATIONS, MIDA2024, 2024, : 862 - 867