Abstract
Instructors of computer programming courses evaluate student progress on code submissions, exams, and other activities. The evaluation of code submissions is typically a summative assessment that gives very little insight into the process the student used when designing and writing the code. Thus, a tool that offers instructors a view into how students actually write their code could have broad impacts on assessment, intervention, instructional design, and plagiarism detection. In this article we propose an interactive software tool with a novel visualization that includes both static and dynamic views of the process that students take to complete computer programming assignments. We report results of an exploratory think-aloud study in which instructors offered thoughts on the utility and potential of the tool. In the think-aloud study, we observed that the instructors easily identified multiple coding strategies (or the lack thereof), were able to recognize plagiarism, and noted a clear need for wider dissemination of tools for visualizing the programming process.
Original language | English |
---|---|
Title of host publication | ACE '22: Australasian Computing Education Conference |
Number of pages | 9 |
Publisher | ACM |
Publication date | 14 Feb 2022 |
Pages | 46-55 |
ISBN (Electronic) | 978-1-4503-9643-1 |
DOIs | |
Publication status | Published - 14 Feb 2022 |
MoE publication type | A4 Article in conference proceedings |
Event | ACE '22: Australasian Computing Education Conference, 14 Feb 2022 → 18 Feb 2022, https://aceconference.wordpress.com/ |
Fields of Science
- 113 Computer and information sciences