Refactoring Inspection Support for Manual Refactoring Edits

Cited by: 17
Authors
Alves, Everton L. G. [1 ]
Song, Myoungkyu [2 ]
Massoni, Tiago [3 ]
Machado, Patricia D. L. [4 ]
Kim, Miryung [5 ]
Affiliations
[1] Univ Fed Campina Grande, Dept Comp Sci, BR-58109900 Campina Grande, Brazil
[2] Univ Nebraska Omaha, Dept Comp Sci, Omaha, NE 68182 USA
[3] Univ Fed Campina Grande, Dept Comp Sci, BR-58429900 Campina Grande, PB, Brazil
[4] Univ Fed Campina Grande, Syst & Comp Dept, BR-58109900 Campina Grande, Brazil
[5] Univ Calif Los Angeles, Dept Comp Sci, Los Angeles, CA 90095 USA
Funding
U.S. National Science Foundation;
Keywords
Refactoring; refactoring anomalies; code inspection; IDENTIFICATION;
DOI
10.1109/TSE.2017.2679742
Chinese Library Classification
TP31 [Computer Software];
Discipline Codes
081202; 0835;
Abstract
Refactoring is commonly performed manually, with regression testing serving as a safety net that provides confidence in the edits performed. However, inadequate test suites may prevent developers from initiating or completing refactorings. We propose REFDISTILLER, a static analysis approach that supports the inspection of manual refactorings. It combines two techniques. First, it applies predefined templates to identify edits that were likely missed during a manual refactoring. Second, it leverages an automated refactoring engine to identify extra edits that might be incorrect. REFDISTILLER also helps determine the root cause of detected anomalies. In our evaluation, REFDISTILLER identifies 97 percent of seeded anomalies, of which 24 percent are not detected by generated test suites. Compared to running existing regression test suites, it detects 22 times more anomalies, with 94 percent precision on average. In a study with 15 professional developers, participants inspected problematic refactorings either with REFDISTILLER or with testing alone. With REFDISTILLER, participants located 90 percent of the seeded anomalies, whereas they located only 13 percent with testing. These results show that REFDISTILLER can help check the correctness of manual refactorings.
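The second technique in the abstract can be pictured as an edit diff: the edits a developer actually made are compared against the edits an automated refactoring engine would generate for the same refactoring, and the two set differences surface potential extra and missed edits. The following is a minimal illustrative sketch of that idea only; the function and edit representations are hypothetical and are not REFDISTILLER's actual API.

```python
# Hypothetical sketch of the extra/missed-edit check (illustrative names,
# not REFDISTILLER's real interface): each edit is modeled as a tuple of
# (kind, description). Edits present only on the manual side are flagged
# as potentially incorrect extra edits; edits present only on the engine
# side are flagged as potentially missed edits.

def diff_edits(manual_edits, engine_edits):
    """Return (extra, missed) edit sets relative to the engine's output."""
    manual, engine = set(manual_edits), set(engine_edits)
    extra = manual - engine   # edits the engine would not have made
    missed = engine - manual  # edits the developer forgot to apply
    return extra, missed

# Example: a manual "rename method" refactoring that misses one call site
# and accidentally changes an unrelated constant (all names made up).
engine = {
    ("decl", "Account.getBal -> Account.getBalance"),
    ("call", "Report.java:42 getBal -> getBalance"),
    ("call", "Audit.java:17 getBal -> getBalance"),
}
manual = {
    ("decl", "Account.getBal -> Account.getBalance"),
    ("call", "Report.java:42 getBal -> getBalance"),
    ("const", "Account.FEE 5 -> 6"),  # extra edit, likely incorrect
}

extra, missed = diff_edits(manual, engine)
print(sorted(extra))   # the suspicious extra edit
print(sorted(missed))  # the missed call-site update
```

In the paper's setting the comparison operates on program elements and AST-level edits rather than plain tuples, but the flagging logic follows the same set-difference intuition.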
Pages: 365-383
Page count: 19