An Automatic Grading System Based on Dynamic Corpora
Djamal Bennouar
Department of Computer Science, Bouira University, Algeria
Abstract: Assessment is a key component of the teaching and
learning process. In most Algerian universities, assessing a student’s answer to an open-ended question, even a short answer question, is a difficult and time-consuming activity. To enhance the quality of the learning process and the overall student evaluation process, and to greatly reduce assessment time and effort, most Algerian universities were provided with an e-learning environment as a result of a government initiative. Unfortunately, such environments seem to be rarely used in the student assessment process, mainly due to the inefficiency of their Automatic Grading Subsystem (AGS) and the underlying corpora. A corpus used in the grading process contains a great number of miscellaneous answers, each graded by more than two experts. Building an efficient corpus for a course is a real challenge. The subjectivity inherent in grading answers may seriously affect the corpus quality. The specific course context defined by a teacher and the time-dependent grading strategy may make the construction of traditional course corpora very difficult. This paper presents a short answer AGS that can dynamically build an up-to-date corpus for each correct reference short answer. The automatically generated corpus is mainly based on a variety of indications specified by the teacher for each reference short answer. Early experiments with the presented AGS have shown its high efficiency for automatic answer grading in some computer science courses.
Keywords: Architectures for educational technology systems, country-specific developments, distance education and e-learning, evaluation methodologies, Computer Aided Assessment (CAA), AGS, short answer, corpus, answer prediction, text similarity.
Received February 27, 2017; accepted May 10, 2017