
Assessment of programming: pedagogical foundations of exams

conference contribution
posted on 2024-10-31, 17:20 authored by Judithe Sheard, Simon, Angela Carbone, Daryl D'Souza, Margaret Hamilton
Previous studies of assessment of programming via written examination have focused on analysis of the examination papers and the questions they contain. This paper reports the results of a study that investigated how these final exam papers are developed, how students are prepared for these exams, and what pedagogical foundations underlie the exams. The study involved interviews with 11 programming lecturers. From our analysis of the interviews, we find that most exams are based on existing formulas that are believed to work; that the lecturers tend to trust in the validity of their exams for summative assessment; and that while there is variation in the approaches taken to writing the exams, all of the exam writers take a fairly standard approach to preparing their students to sit the exam. We find little evidence of explicit references to learning theories or models, indicating that the process is based largely on intuition and experience.

History

Start page

141

End page

146

Total pages

6

Outlet

Proceedings of the 18th ACM Conference on Innovation and Technology in Computer Science Education

Editors

Janet Carter

Name of conference

ITiCSE 2013

Publisher

Association for Computing Machinery (ACM)

Place published

New York, United States

Start date

2013-07-01

End date

2013-07-03

Language

English

Copyright

© 2013 Association for Computing Machinery (ACM) Inc

Former Identifier

2006041639

Esploro creation date

2020-06-22

Fedora creation date

2013-07-31
