Testing and Debugging Habits of Intermediate Student Programmers

Cited by: 0
Authors
Izu, Cruz [1]
Weerasinghe, Amali [1]
Affiliations
[1] Univ Adelaide, Adelaide, SA, Australia
Source
2024 IEEE GLOBAL ENGINEERING EDUCATION CONFERENCE, EDUCON 2024 | 2024
Keywords
Programming; CS2; testing; debugging
DOI
10.1109/EDUCON60312.2024.10578650
CLC Number
TP39 [Computer Applications]
Discipline Classification Codes
081203; 0835
Abstract
Testing and debugging studies at the undergraduate level have focused on the needs of novice programmers. Novice struggles have been identified, and activities have been developed to expose CS1 students to a range of bug types and debugging techniques. As computer science students gain coding experience and complete further programming courses, they are expected to become competent debuggers with limited or no further instruction. However, not all students advance at the same pace, and more work is needed to understand the debugging skills of the average CS2 student. This study explores the practices and habits developed by these intermediate students in order to validate this expectation and to identify possible gaps that require further support. To become competent debuggers, students should test and debug their code locally before submission instead of relying on the assignment's testing script to report failed test cases. We therefore designed an online quiz to capture students' testing and debugging habits before and after they submit their code to an automatic grading system. The quiz answers from a second-year elective programming subject indicate that the average student used at least three techniques for bug investigation, with 55% of students using diagnostic print statements (DPS), tracing of failed test cases, and reading code. Notably, "reading code" to check the steps of the computation was the only technique with a statistically significant relationship to course performance.
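The abstract names diagnostic print statements (DPS) and tracing of failed test cases as the most common investigation techniques. As a minimal, hypothetical sketch of what DPS looks like in practice (the function and test values below are illustrative and not taken from the study's materials), a student might instrument a suspect loop before resubmitting to the grader:

def running_average(values):
    """Return the running average after each element of a list of numbers."""
    averages = []
    total = 0
    for i, v in enumerate(values):
        total += v
        # Diagnostic print statement (DPS): expose the loop state so the
        # student can spot where the computed value diverges from the expected one.
        print(f"DEBUG i={i} v={v} total={total} avg={total / (i + 1)}")
        averages.append(total / (i + 1))
    return averages

# Tracing a failed test case locally rather than relying on the grading script.
print(running_average([2, 4, 6]))  # expected output: [2.0, 3.0, 4.0]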
Pages: 10