A Parser-based Tool to Assist Instructors in Grading Computer Graphics Assignments

Date
2019
Publisher
The Eurographics Association
Abstract
Although online e-learning environments are increasingly used in university courses, manual assessment still dominates the way students are graded. Interactive judges that provide a pass/fail verdict based on test sets are valuable tools for both learning and assessment, but they still rely on human review of the code for output-independent issues such as readability and efficiency. In this paper we present a tool to assist instructors in grading programming exercises in Computer Graphics (CG) courses. In contrast to other grading solutions, assessment is based both on checking the output against test sets and on a set of instructor-defined rubrics evaluated through syntax analysis of the source code. Our current prototype runs in Python and supports the assessment of shaders written in the GLSL language. We tested the tool in a CG course involving more than one hundred Computer Science students per year. Our first experiments show that the tool can be useful for supporting both self-assessment and grading, as well as for detecting grading mistakes through anomaly detection techniques based on features extracted from the syntax analysis.
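The abstract describes the approach only at a high level. Purely as an illustrative sketch, and not the authors' implementation, the Python snippet below shows what instructor-defined, syntax-based rubric checks over GLSL source might look like. Every name here (tokenize, RUBRIC, grade) and every rule is a hypothetical example; the paper's prototype performs real syntax analysis with a parser, whereas this sketch only tokenizes with a regular expression.

import re

# Crude GLSL tokenizer: strip comments, then split into identifier,
# number, and single-character tokens. A stand-in for a real parser.
TOKEN_RE = re.compile(r"[A-Za-z_]\w*|\d+\.?\d*|\S")

def tokenize(src: str) -> list[str]:
    src = re.sub(r"//[^\n]*|/\*.*?\*/", " ", src, flags=re.DOTALL)
    return TOKEN_RE.findall(src)

# Hypothetical rubric: each rule pairs a description with a predicate
# over the token list. Pass/fail per rule, as an instructor might define.
RUBRIC = [
    ("normals are renormalized after interpolation",
     lambda toks: "normalize" in toks),
    ("no dynamic branching in the fragment shader",
     lambda toks: "if" not in toks),
    ("at most two texture fetches",
     lambda toks: toks.count("texture") <= 2),
]

def grade(src: str) -> list[tuple[str, bool]]:
    toks = tokenize(src)
    return [(desc, check(toks)) for desc, check in RUBRIC]

if __name__ == "__main__":
    shader = """
    #version 330 core
    uniform sampler2D tex;
    in vec3 N; in vec2 uv;
    out vec4 fragColor;
    void main() {
        vec3 n = normalize(N);      // renormalize after interpolation
        vec4 base = texture(tex, uv);
        fragColor = base * max(dot(n, vec3(0.0, 0.0, 1.0)), 0.0);
    }
    """
    for desc, ok in grade(shader):
        print(("PASS" if ok else "FAIL"), "-", desc)

A token-level scan suffices for presence/absence rules like these; rubrics about program structure (loop nesting, dead code, where lighting is computed) would need the full parse tree that the paper's tool extracts its features from.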
Citation
@inproceedings{10.2312:eged.20191025,
  booktitle = {Eurographics 2019 - Education Papers},
  editor    = {Tarini, Marco and Galin, Eric},
  title     = {{A Parser-based Tool to Assist Instructors in Grading Computer Graphics Assignments}},
  author    = {Andujar, Carlos and Raluca Vijulie, Cristina and Vinacua, Alvar},
  year      = {2019},
  publisher = {The Eurographics Association},
  ISSN      = {1017-4656},
  ISBN      = {},
  DOI       = {10.2312/eged.20191025}
}