Crowdsourcing programming assignments with CrowdSorcerer

Nea Pirttinen, Vilma Kangas, Irene Nikkarinen, Henrik Nygren, Juho Leinonen, Arto Hellas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

Small automatically assessed programming assignments are an often used resource for learning programming. Creating sufficiently large amounts of such assignments is, however, time consuming. As a consequence, offering large quantities of practice assignments to students is not always possible. CrowdSorcerer is an embeddable open-source system that students and teachers alike can use for creating and evaluating small automatically assessed programming assignments. While creating programming assignments, the students also write simple input-output tests, and are gently introduced to the basics of testing. Students can also evaluate the assignments of others and provide feedback on them, which exposes them to code written by others early in their education. In this article we both describe the CrowdSorcerer system and our experiences in using the system in a large undergraduate programming course. Moreover, we discuss the motivation for crowdsourcing course assignments and present some usage statistics.
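To make the abstract's mention of student-authored input-output tests concrete, the sketch below shows one way such a test could be represented and checked: an input string paired with the output the program is expected to print. This is a minimal illustration only; the class and field names are assumptions made for this record, not CrowdSorcerer's actual data model or API.

// Illustrative sketch of a student-authored input-output test.
// Names are hypothetical; they do not come from CrowdSorcerer itself.
public class InputOutputTest {
    private final String input;          // text fed to the program's standard input
    private final String expectedOutput; // text the program is expected to print

    public InputOutputTest(String input, String expectedOutput) {
        this.input = input;
        this.expectedOutput = expectedOutput;
    }

    /** Returns true if the program's actual output matches the expected output. */
    public boolean passes(String actualOutput) {
        return expectedOutput.trim().equals(actualOutput.trim());
    }

    public static void main(String[] args) {
        // A test a student might write for a hypothetical "sum two integers" assignment.
        InputOutputTest test = new InputOutputTest("3 4", "7");
        System.out.println("Input given to the program: " + test.input);
        System.out.println(test.passes("7") ? "PASS" : "FAIL"); // prints PASS
    }
}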
Original language: English
Title of host publication: Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education
Number of pages: 6
Place of Publication: New York, NY
Publisher: ACM
Publication date: 2 Jul 2018
Pages: 326-331
ISBN (Electronic): 978-1-4503-5707-4
DOIs: https://doi.org/10.1145/3197091.3197117
Publication status: Published - 2 Jul 2018
MoE publication type: A4 Article in conference proceedings
Event: 23rd Annual Conference on Innovation and Technology in Computer Science Education (ITiCSE 2018) - University of Central Lancashire Cyprus, Larnaca, Cyprus
Duration: 2 Jul 2018 - 4 Jul 2018
Conference number: 23
https://iticse.acm.org/

Fields of Science

  • 113 Computer and information sciences

Cite this

Pirttinen, N., Kangas, V., Nikkarinen, I., Nygren, H., Leinonen, J., & Hellas, A. (2018). Crowdsourcing programming assignments with CrowdSorcerer. In Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education (pp. 326-331). New York, NY: ACM. https://doi.org/10.1145/3197091.3197117
Pirttinen, Nea ; Kangas, Vilma ; Nikkarinen, Irene ; Nygren, Henrik ; Leinonen, Juho ; Hellas, Arto. / Crowdsourcing programming assignments with CrowdSorcerer. Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education. New York, NY : ACM, 2018. pp. 326-331
@inproceedings{d6d3a01ec0eb4b39a32e82985bb6ee26,
title = "Crowdsourcing programming assignments with CrowdSorcerer",
abstract = "Small automatically assessed programming assignments are an often used resource for learning programming. Creating sufficiently large amounts of such assignments is, however, time consuming. As a consequence, offering large quantities of practice assignments to students is not always possible. CrowdSorcerer is an embeddable open-source system that students and teachers alike can use for creating and evaluating small automatically assessed programming assignments. While creating programming assignments, the students also write simple input-output -tests, and are gently introduced to the basics of testing. Students can also evaluate the assignments of others and provide feedback on them, which exposes them to code written by others early in their education. In this article we both describe the CrowdSorcerer system and our experiences in using the system in a large undergraduate programming course. Moreover, we discuss the motivation for crowdsourcing course assignments and present some usage statistics.",
keywords = "113 Computer and information sciences",
author = "Nea Pirttinen and Vilma Kangas and Irene Nikkarinen and Henrik Nygren and Juho Leinonen and Arto Hellas",
year = "2018",
month = "7",
day = "2",
doi = "10.1145/3197091.3197117",
language = "English",
pages = "326--331",
booktitle = "Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education",
publisher = "ACM",
address = "International",

}

Pirttinen, N, Kangas, V, Nikkarinen, I, Nygren, H, Leinonen, J & Hellas, A 2018, Crowdsourcing programming assignments with CrowdSorcerer. in Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education. ACM, New York, NY, pp. 326-331, 23rd Annual Conference on Innovation and Technology in Computer Science Education (ITiCSE 2018), Larnaca, Cyprus, 02/07/2018. https://doi.org/10.1145/3197091.3197117

Crowdsourcing programming assignments with CrowdSorcerer. / Pirttinen, Nea; Kangas, Vilma; Nikkarinen, Irene; Nygren, Henrik; Leinonen, Juho; Hellas, Arto.

Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education. New York, NY : ACM, 2018. p. 326-331.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

TY - GEN

T1 - Crowdsourcing programming assignments with CrowdSorcerer

AU - Pirttinen, Nea

AU - Kangas, Vilma

AU - Nikkarinen, Irene

AU - Nygren, Henrik

AU - Leinonen, Juho

AU - Hellas, Arto

PY - 2018/7/2

Y1 - 2018/7/2

N2 - Small automatically assessed programming assignments are an often used resource for learning programming. Creating sufficiently large amounts of such assignments is, however, time consuming. As a consequence, offering large quantities of practice assignments to students is not always possible. CrowdSorcerer is an embeddable open-source system that students and teachers alike can use for creating and evaluating small automatically assessed programming assignments. While creating programming assignments, the students also write simple input-output tests, and are gently introduced to the basics of testing. Students can also evaluate the assignments of others and provide feedback on them, which exposes them to code written by others early in their education. In this article we both describe the CrowdSorcerer system and our experiences in using the system in a large undergraduate programming course. Moreover, we discuss the motivation for crowdsourcing course assignments and present some usage statistics.

AB - Small automatically assessed programming assignments are an often used resource for learning programming. Creating sufficiently large amounts of such assignments is, however, time consuming. As a consequence, offering large quantities of practice assignments to students is not always possible. CrowdSorcerer is an embeddable open-source system that students and teachers alike can use for creating and evaluating small automatically assessed programming assignments. While creating programming assignments, the students also write simple input-output tests, and are gently introduced to the basics of testing. Students can also evaluate the assignments of others and provide feedback on them, which exposes them to code written by others early in their education. In this article we both describe the CrowdSorcerer system and our experiences in using the system in a large undergraduate programming course. Moreover, we discuss the motivation for crowdsourcing course assignments and present some usage statistics.

KW - 113 Computer and information sciences

U2 - 10.1145/3197091.3197117

DO - 10.1145/3197091.3197117

M3 - Conference contribution

SP - 326

EP - 331

BT - Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education

PB - ACM

CY - New York, NY

ER -

Pirttinen N, Kangas V, Nikkarinen I, Nygren H, Leinonen J, Hellas A. Crowdsourcing programming assignments with CrowdSorcerer. In Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education. New York, NY: ACM. 2018. p. 326-331 https://doi.org/10.1145/3197091.3197117