PerfJIT: Test-Level Just-in-Time Prediction for Performance Regression Introducing Commits

Cited by: 22
Authors
Chen, Jinfu [1]
Shang, Weiyi [1]
Shihab, Emad [1]
Affiliations
[1] Concordia Univ, Dept Comp Sci & Software Engn, Montreal, PQ H3G 1M8, Canada
Keywords
Measurement; Predictive models; Software; Task analysis; Benchmark testing; Logistics; Performance regression; software performance; software quality; mining software repositories; empirical software engineering; DEFECT PREDICTION; SOFTWARE; FAULTS;
DOI
10.1109/TSE.2020.3023955
Chinese Library Classification (CLC)
TP31 [Computer software];
Discipline classification codes
081202; 0835;
Abstract
Performance issues may compromise user experience, increase resource costs, and cause field failures. One of the most prevalent performance issues is performance regression. Due to the importance of and challenges in performance regression detection, prior research proposes various automated approaches that detect performance regressions. However, such detection is conducted after the system is built and deployed, so large amounts of resources are still required to locate and fix performance regressions. In this paper, we propose an approach that automatically predicts whether a test would manifest performance regressions given a code commit. In particular, we extract both traditional metrics and performance-related metrics from the code changes that are associated with each test. For each commit, we build random forest classifiers trained on all prior commits to predict whether each test would manifest a performance regression in that commit. We conduct case studies on three open-source systems (Hadoop, Cassandra, and OpenJPA). Our results show that our approach can predict tests that manifest performance regressions in a commit with high AUC values (0.86 on average). Our approach can drastically reduce the testing time needed to detect performance regressions. In addition, we find that our approach could be used to detect the introduction of six out of nine real-life performance issues from the subject systems during our studied period. Finally, we find that traditional metrics associated with size and code change history are the most important factors in our models. Our approach and study results can be leveraged by practitioners to cope with performance regressions in a timely and proactive manner.
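The abstract describes a per-commit, test-level prediction setup: for each commit, a random forest is trained on feature rows from all prior commits and then scores the tests touched by the current commit, evaluated with AUC. The following minimal Python sketch illustrates that loop with scikit-learn; the feature names (lines_added, prior_regressions_in_file, etc.) and the synthetic data are illustrative assumptions, not the paper's actual metric set or subject-system data.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical per-(commit, test) rows: traditional metrics (size, history) plus a
# performance-related flag, and a label for whether the test showed a regression.
# These columns are placeholders, not the metrics used in the paper.
n_rows = 500
data = pd.DataFrame({
    "commit_index": np.sort(rng.integers(0, 50, n_rows)),  # chronological commit order
    "lines_added": rng.poisson(20, n_rows),
    "lines_deleted": rng.poisson(10, n_rows),
    "prior_regressions_in_file": rng.poisson(1, n_rows),
    "touches_loop_or_io": rng.integers(0, 2, n_rows),
    "regressed": rng.integers(0, 2, n_rows),
})
features = ["lines_added", "lines_deleted",
            "prior_regressions_in_file", "touches_loop_or_io"]

aucs = []
for commit in sorted(data["commit_index"].unique()):
    train = data[data["commit_index"] < commit]    # all prior commits only
    test = data[data["commit_index"] == commit]    # tests touched by this commit
    # Skip commits where training lacks both classes or the AUC is undefined.
    if train["regressed"].nunique() < 2 or test["regressed"].nunique() < 2:
        continue
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(train[features], train["regressed"])
    scores = clf.predict_proba(test[features])[:, 1]
    aucs.append(roc_auc_score(test["regressed"], scores))

print(f"mean AUC over {len(aucs)} commits: {np.mean(aucs):.2f}")

With real regression labels and meaningful change metrics (rather than the random placeholders above), the per-commit AUC values would correspond to the evaluation the abstract reports.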
Pages: 1529-1544
Page count: 16
Related papers
50 items in total
  • [21] An Empirical Study on Just-in-time Conformal Defect Prediction
    Shahini, Xhulja
    Metzger, Andreas
    Pohl, Klaus
    2024 IEEE/ACM 21ST INTERNATIONAL CONFERENCE ON MINING SOFTWARE REPOSITORIES, MSR, 2024, : 88 - 99
  • [22] Just-in-time defect prediction for mobile applications: using shallow or deep learning?
    van Dinter, Raymon
    Catal, Cagatay
    Giray, Goerkem
    Tekinerdogan, Bedir
    SOFTWARE QUALITY JOURNAL, 2023, 31 (04) : 1281 - 1302
  • [23] Graph-based machine learning improves just-in-time defect prediction
    Bryan, Jonathan
    Moriano, Pablo
    PLOS ONE, 2023, 18 (04):
  • [24] A Replication Study: Just-In-Time Defect Prediction with Ensemble Learning
    Young, Steven
    Abdou, Tamer
    Bener, Ayse
    2018 IEEE/ACM 6TH INTERNATIONAL WORKSHOP ON REALIZING ARTIFICIAL INTELLIGENCE SYNERGIES IN SOFTWARE ENGINEERING (RAISE), 2018, : 42 - 47
  • [25] Fine-Grained Just-In-Time Defect Prediction at the Block Level in Infrastructure-as-Code (IaC)
    Begoug, Mahi
    Chouchen, Moataz
    Ouni, Ali
    AlOmar, Eman Abdullah
    Mkaouer, Mohamed Wiem
    2024 IEEE/ACM 21ST INTERNATIONAL CONFERENCE ON MINING SOFTWARE REPOSITORIES, MSR, 2024, : 100 - 112
  • [26] FENSE: A feature-based ensemble modeling approach to cross-project just-in-time defect prediction
    Zhang, Tanghaoran
    Yu, Yue
    Mao, Xinjun
    Lu, Yao
    Li, Zhixing
    Wang, Huaimin
    EMPIRICAL SOFTWARE ENGINEERING, 2022, 27 (07)
  • [27] Watch Out for Extrinsic Bugs! A Case Study of Their Impact in Just-In-Time Bug Prediction Models on the OpenStack Project
    Rodriguez-Perez, Gema
    Nagappan, Meiyappan
    Robles, Gregorio
    IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, 2022, 48 (04) : 1400 - 1416
  • [28] Supplier selection and performance evaluation in just-in-time production environments
    Aksoy, Asli
    Ozturk, Nursel
    EXPERT SYSTEMS WITH APPLICATIONS, 2011, 38 (05) : 6351 - 6359
  • [29] Just-In-Time Defect Prediction on JavaScript Projects: A Replication Study
    Ni, Chao
    Xia, Xin
    Lo, David
    Yang, Xiaohu
    Hassan, Ahmed E.
    ACM TRANSACTIONS ON SOFTWARE ENGINEERING AND METHODOLOGY, 2022, 31 (04)
  • [30] Simplified Deep Forest Model Based Just-in-Time Defect Prediction for Android Mobile Apps
    Zhao, Kunsong
    Xu, Zhou
    Zhang, Tao
    Tang, Yutian
    Yan, Meng
    IEEE TRANSACTIONS ON RELIABILITY, 2021, 70 (02) : 848 - 859