Does Creating Programming Assignments with Tests Lead to Improved Performance in Writing Unit Tests?

Vilma Kangas, Nea Pirttinen, Henrik Nygren, Juho Leinonen, Arto Hellas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › Peer-reviewed


We have constructed a tool, CrowdSorcerer, in which students create programming assignments, their model solutions, and associated test cases using a simple input-output format. We have used the tool as part of an introductory programming course alongside normal course activities such as programming assignments and a final exam. In our work, we focus on whether creating programming assignments and associated tests correlates with students' performance in a testing-related exam question. We study this through an analysis of the quality of student-written tests within the tool, measured using the number of test cases, line coverage, and mutation coverage, and of students' performance in the testing-related exam question, measured using exam points. Finally, we study whether previous programming experience correlates with how students act within the tool and how they perform in the testing-related exam question.
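The abstract does not specify the tool's input-output format beyond its name; as a hypothetical illustration, a test case in such a format might pair a program input with the expected output, with a grader comparing the program's actual output against it. The function names, test-case structure, and example solution below are illustrative assumptions, not CrowdSorcerer's actual format.

```python
# Hypothetical sketch of an input-output style test case, in the spirit of
# the "simple input-output format" described in the abstract. All names and
# structures here are assumptions for illustration only.

def model_solution(line: str) -> str:
    """A student's model solution: read a number, output its double."""
    return str(2 * int(line))

# Each student-written test case pairs an input string with the expected output.
test_cases = [
    {"input": "3", "expected_output": "6"},
    {"input": "-4", "expected_output": "-8"},
]

def run_tests(program, cases):
    """Feed each input to the program and compare against the expected output."""
    results = []
    for case in cases:
        actual = program(case["input"])
        results.append(actual == case["expected_output"])
    return results

print(run_tests(model_solution, test_cases))  # → [True, True]
```

Quality metrics such as line coverage and mutation coverage could then be computed by running such test cases against instrumented or deliberately mutated versions of the solution.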
Original language: English
Title of host publication: CompEd '19: Proceedings of the ACM Conference on Global Computing Education
Number of pages: 7
Place of publication: New York
Publication date: 9 May 2019
ISBN (Electronic): 978-1-4503-6259-7
Publication status: Published - 9 May 2019
MoE publication type: A4 Article in conference proceedings
Event: ACM Global Computing Education Conference (CompEd '19) - Chengdu, China
Duration: 17 May 2019 - 19 May 2019
Conference number: 1

Fields of Science

  • 113 Computer and information sciences
