Towards Comprehensive Assessment of Code Quality at CS1-level: Tools, Rubrics and Refactoring Rules

Cited by: 1
Authors
Izu, Cruz [1 ]
Mirolo, Claudio [2 ]
Affiliations
[1] Univ Adelaide, Adelaide, SA, Australia
[2] Univ Udine, Udine, Italy
Source
2024 IEEE GLOBAL ENGINEERING EDUCATION CONFERENCE, EDUCON 2024 | 2024
Keywords
CS1; code quality; linters; code smells
DOI
10.1109/EDUCON60312.2024.10578672
CLC Number
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
While most student code is assessed for correctness and functionality, recent work has looked at extending automatic assessment to include quality aspects. In software engineering, code reviews help developers increase the quality of a project by identifying and cleaning up poor structures, commonly referred to as code smells. Despite the availability of professional tools, evaluating the quality of small programs at the CS1 level is quite different from evaluating a complex software system. Thus, identifying meaningful quality criteria for small programs written by novices, and either adapting current tools or designing new ones for that purpose, are topics worth investigating. The present work contributes to this aim by analysing the code produced by CS1 students from three different perspectives: (i) inspecting the feedback of two automated tools, Hyperstyle and Pylint; (ii) matching the smells addressed by a set of refactoring rules; (iii) devising and using a manual rubric. A comparative analysis highlights the strengths and weaknesses of these approaches. Overall, automatic quality feedback needs to be complemented with classroom instruction on manually detecting code issues and deciding whether they need refactoring. Additionally, such review activities have the potential to develop code comprehension by engaging novice programmers in reflecting on their own code.
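To make the first perspective concrete, the sketch below shows one plausible way to collect linter feedback on a single CS1 submission using Pylint's JSON output. This is an illustrative assumption, not the authors' actual pipeline or Hyperstyle's internals; the file name student_solution.py is a placeholder.

```python
# Illustrative sketch (not the paper's pipeline): run Pylint on one
# CS1-style submission and list the style/smell messages it reports.
# Assumes Pylint is installed and "student_solution.py" exists.
import json
import subprocess
import sys


def pylint_report(path: str) -> list[dict]:
    """Run Pylint with JSON output and return its list of messages."""
    # Pylint exits with a non-zero status whenever it finds issues,
    # so the return code is ignored here and only stdout is parsed.
    proc = subprocess.run(
        [sys.executable, "-m", "pylint", "--output-format=json", path],
        capture_output=True,
        text=True,
    )
    return json.loads(proc.stdout or "[]")


if __name__ == "__main__":
    for msg in pylint_report("student_solution.py"):
        # Each message records, among other fields, the line number,
        # a symbolic name (e.g. "simplifiable-if-statement") and a
        # human-readable description of the detected issue.
        print(f'line {msg["line"]:>3}: {msg["symbol"]} - {msg["message"]}')
```

A typical novice smell surfaced this way is an if/else that returns True and False explicitly, which Pylint flags as simplifiable-if-statement and which a refactoring rule would collapse into returning the condition directly.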
Pages: 10