On the "Naturalness" of Buggy Code

Cited by: 163
Authors
Ray, Baishakhi [1 ]
Hellendoorn, Vincent [2 ]
Godhane, Saheel [2 ]
Tu, Zhaopeng [3 ]
Bacchelli, Alberto [4 ]
Devanbu, Premkumar [2 ]
Affiliations
[1] University of Virginia, Charlottesville, VA 22903, USA
[2] University of California, Davis, Davis, CA 95616, USA
[3] Huawei Technologies Co. Ltd., Shenzhen, Guangdong, China
[4] Delft University of Technology, Delft, Netherlands
Source
2016 IEEE/ACM 38th International Conference on Software Engineering (ICSE), 2016
Funding
U.S. National Science Foundation
Keywords
Predicting faults
DOI
10.1145/2884781.2884848
Chinese Library Classification (CLC): TP31 (Computer Software)
Discipline codes: 081202; 0835
Abstract
Real software, the kind working programmers produce by the kLOC to solve real-world problems, tends to be "natural", like speech or natural language; it tends to be highly repetitive and predictable. Researchers have captured this naturalness of software through statistical models and used them to good effect in suggestion engines, porting tools, coding standards checkers, and idiom miners. This suggests that code that appears improbable, or surprising, to a good statistical language model is "unnatural" in some sense, and thus possibly suspicious. In this paper, we investigate this hypothesis. We consider a large corpus of bug fix commits (ca. 7,139), from 10 different Java projects, and focus on its language statistics, evaluating the naturalness of buggy code and the corresponding fixes. We find that code with bugs tends to be more entropic (i.e., unnatural), becoming less so as bugs are fixed. Ordering files for inspection by their average entropy yields cost-effectiveness scores comparable to popular defect prediction methods. At a finer granularity, focusing on highly entropic lines is similar in cost-effectiveness to some well-known static bug finders (PMD, FindBugs), and ordering warnings from these bug finders using an entropy measure improves the cost-effectiveness of inspecting code implicated in warnings. This suggests that entropy may be a valid, simple way to complement the effectiveness of PMD or FindBugs, and that search-based bug-fixing methods may benefit from using entropy both for fault-localization and searching for fixes.
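To illustrate what "entropy" means in this setting: a language model is trained on a body of reference code, and each token (or line) is scored by how improbable the model finds it; higher average negative log-probability means more "surprising", hence potentially more suspicious. The sketch below is a minimal, hypothetical illustration using a whitespace-tokenized trigram model with Laplace smoothing. It is not the cache-based n-gram model the paper actually uses, and the toy corpus and names are invented for illustration only.

```python
# Minimal sketch: rank lines of code by cross-entropy under a trigram model.
# Assumptions: whitespace tokenization, add-one smoothing; the paper's model
# (a cache-augmented n-gram model) is considerably more sophisticated.
import math
from collections import Counter

def ngrams(tokens, n=3, pad="<s>"):
    padded = [pad] * (n - 1) + tokens
    return [tuple(padded[i:i + n]) for i in range(len(tokens))]

class TrigramModel:
    def __init__(self, training_lines, n=3):
        self.n = n
        self.counts = Counter()          # trigram counts
        self.context_counts = Counter()  # bigram (context) counts
        self.vocab = set()
        for line in training_lines:
            toks = line.split()
            self.vocab.update(toks)
            for gram in ngrams(toks, n):
                self.counts[gram] += 1
                self.context_counts[gram[:-1]] += 1

    def prob(self, gram):
        # Add-one (Laplace) smoothing; real models use better smoothing schemes.
        v = len(self.vocab) + 1
        return (self.counts[gram] + 1) / (self.context_counts[gram[:-1]] + v)

    def line_entropy(self, line):
        # Average negative log2 probability per token: higher = more "unnatural".
        toks = line.split()
        if not toks:
            return 0.0
        logps = [math.log2(self.prob(g)) for g in ngrams(toks, self.n)]
        return -sum(logps) / len(toks)

# Usage: order lines from most to least surprising under the model.
corpus = ["int i = 0 ;", "for ( int i = 0 ; i < n ; i ++ )", "return result ;"]
model = TrigramModel(corpus)
snippet = ["int j = 0 ;", "j = j ++ - -- j ;"]
for line in sorted(snippet, key=model.line_entropy, reverse=True):
    print(f"{model.line_entropy(line):5.2f}  {line}")
```

In the same spirit, a file-level score can be obtained by averaging line entropies, which is the kind of ordering the abstract compares against defect prediction methods.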
Pages: 428-439
Number of pages: 12