Application of Seq2Seq Models on Code Correction

Cited by: 6
Authors
Huang, Shan [1 ]
Zhou, Xiao [2 ]
Chin, Sang [2 ,3 ,4 ]
Affiliations
[1] Boston Univ, Dept Phys, 590 Commonwealth Ave, Boston, MA 02215 USA
[2] Boston Univ, Dept Comp Sci, Boston, MA 02215 USA
[3] MIT, Dept Brain & Cognit Sci, Boston, MA USA
[4] Harvard Univ, Ctr Math Sci & Applicat, Boston, MA 02115 USA
Source
FRONTIERS IN ARTIFICIAL INTELLIGENCE | 2021, Vol. 4
Funding
US National Science Foundation;
Keywords
programming language correction; seq2seq architecture; pyramid encoder; attention mechanism; transfer learning;
DOI
10.3389/frai.2021.590215
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We apply various seq2seq models to programming language correction tasks on the Juliet Test Suite for C/C++ and Java from the Software Assurance Reference Datasets, achieving repair rates of 75% (C/C++) and 56% (Java). We introduce a pyramid encoder into these seq2seq models, which significantly increases computational and memory efficiency while achieving repair rates similar to their non-pyramid counterparts. We successfully carry out an error type classification task on ITC benchmark examples (only 685 code instances) using transfer learning with models pretrained on the Juliet Test Suite, pointing to a novel way of processing small programming language datasets.
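The core idea behind the pyramid encoder is to merge adjacent time steps between encoder layers, so each successive layer (and the attention mechanism on top) operates over a shorter sequence. The sketch below is an illustrative pure-Python reduction of that idea; the merge-by-concatenation choice, the function names, and the layer counts are assumptions for exposition, not the paper's exact implementation.

```python
# Sketch of a pyramid encoder's time-step reduction: adjacent hidden
# states are merged (here by concatenating feature vectors), halving the
# sequence length each layer so upper layers process fewer steps.

def pyramid_merge(states):
    """Merge adjacent time steps by concatenating their feature vectors.
    Pads with a zero vector if the sequence length is odd."""
    if len(states) % 2 == 1:
        states = states + [[0.0] * len(states[0])]
    return [states[i] + states[i + 1] for i in range(0, len(states), 2)]

def pyramid_lengths(seq_len, num_layers):
    """Sequence length seen by each successive encoder layer."""
    lengths = [seq_len]
    for _ in range(num_layers - 1):
        lengths.append((lengths[-1] + 1) // 2)
    return lengths

# A 3-layer pyramid encoder over a length-8 token sequence processes
# 8, 4, then 2 time steps, instead of 8 steps at every layer.
print(pyramid_lengths(8, 3))  # [8, 4, 2]

states = [[float(t)] * 4 for t in range(8)]  # 8 steps, 4 features each
merged = pyramid_merge(states)
print(len(merged), len(merged[0]))           # 4 steps, 8 features
```

Since attention cost scales with encoder sequence length, halving the length at each layer is what yields the computational and memory savings the abstract reports.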
Pages: 13
Related Papers
33 items in total
  • [21] Short-term power load forecasting based on Seq2Seq model integrating Bayesian optimization, temporal convolutional network and attention
    Dai, Yeming
    Yu, Weijie
    APPLIED SOFT COMPUTING, 2024, 166
  • [22] A Semantic-Embedding Model-Driven Seq2Seq Method for Domain-Oriented Entity Linking on Resource-Restricted Devices
    Inan, Emrah
    Dikenelli, Oguz
    INTERNATIONAL JOURNAL ON SEMANTIC WEB AND INFORMATION SYSTEMS, 2021, 17 (03) : 73 - 87
  • [23] A Novel Long Short-Term Memory Seq2Seq Model with Chaos-Based Optimization and Attention Mechanism for Enhanced Dam Deformation Prediction
    Wang, Lei
    Wang, Jiajun
    Tong, Dawei
    Wang, Xiaoling
    BUILDINGS, 2024, 14 (11)
  • [24] State-of-charge estimation hybrid method for lithium-ion batteries using BiGRU and AM co-modified Seq2Seq network and H-infinity filter
    Kuang, Pan
    Zhou, Fei
    Xu, Shuai
    Li, Kangqun
    Xu, Xiaobin
    ENERGY, 2024, 300
  • [25] Optimized EWT-Seq2Seq-LSTM with Attention Mechanism to Insulators Fault Prediction
    Klaar, Anne Carolina Rodrigues
    Stefenon, Stefano Frizzo
    Seman, Laio Oriel
    Mariani, Viviana Cocco
    Coelho, Leandro dos Santos
    SENSORS, 2023, 23 (06)
  • [26] Forcing-Seq2Seq Model: An Automatic Model of Title Generation for Natural Text Using Deep Learning
    Thuan Nguyen Thi Hiep
    Nhan To Thanh
    Tho Quan Thanh
    PROCEEDINGS OF THE FUTURE TECHNOLOGIES CONFERENCE (FTC) 2021, VOL 2, 2022, 359 : 388 - 402
  • [27] Wavelet-Seq2Seq-LSTM with attention for time series forecasting of level of dams in hydroelectric power plants
    Stefenon, Stefano Frizzo
    Seman, Laio Oriel
    Aquino, Luiza Scapinello
    Coelho, Leandro dos Santos
    ENERGY, 2023, 274
  • [28] Lane Change Trajectory Prediction of Vehicles in Highway Interweaving Area Using Seq2Seq-attention Network
    Han H.
    Xie T.
    Zhongguo Gonglu Xuebao/China Journal of Highway and Transport, 2020, 33 (06): : 106 - 118
  • [29] A general deep learning framework for history-dependent response prediction based on UA-Seq2Seq model
    Wang, Chen
    Xu, Li-yan
    Fan, Jian-sheng
    COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2020, 372
  • [30] Short-time multi-energy load forecasting method based on CNN-Seq2Seq model with attention mechanism
    Zhang, Ge
    Bai, Xiaoqing
    Wang, Yuxuan
    MACHINE LEARNING WITH APPLICATIONS, 2021, 5