For the “Computing in Molecular Biology” courses we evaluate what students have learned using exams delivered in digital form. To keep this fair and transparent, I use an automated process, described here.
Presenting the exam to the student
The exam questions are delivered on paper at the beginning of the examination. The text includes a link to a webpage from where the students download a template for their answers. The answers are written in a plain text file, the most open and accessible format and the basic tool of scientific computing. The students write their answers in the space given to them, respecting the structure of the template. An example of the template can be seen here.
Once they have finished, they send their answer file as an email attachment to my account at Istanbul University. The destination address includes the tag +cmb, which lets an automatic rule on the server classify the messages into the course folder. The messages are also automatically copied to my Google Mail account. This way the answers are backed up in two systems, where the delivery date gets recorded in a way that cannot be modified by third parties.
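The tag-based routing relies on standard plus-addressing, where everything between + and @ in the local part of the address acts as a label. A minimal sketch of how such an address decomposes (an illustrative helper, not part of the actual course tooling):

```python
def split_plus_address(addr: str):
    """Split 'user+tag@host' into (user, tag, host); tag is '' if absent."""
    local, _, host = addr.partition("@")
    user, _, tag = local.partition("+")
    return user, tag, host

# A tagged address routes to the same mailbox as the untagged one,
# but the tag survives in the headers, so a server rule can match it.
print(split_plus_address("grader+cmb@example.edu"))
```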
Electronic signature of answers
Immediately after the last answer has been delivered, I download all the files attached to messages carrying the CMB tag that were received on the day of the exam. This is done automatically and exhaustively with the program attach_downloader.py, which uses the official Google Mail Application Programming Interface to access my account. This program saves each attachment under a filename that is the electronic signature of the student’s answer. This has two purposes:
- The electronic signature of a file changes if any part of the file is modified. It is therefore a guarantee that the file I evaluate is exactly the one the student sent. Nobody can change it without detection: not me, not the student, no one else.
- The whole evaluation process hides the name of the student, so the grades depend only on the content of the answer and not on who wrote it.
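The naming scheme can be sketched in a few lines. This assumes the signature is an MD5 digest of the file content, consistent with the verification website mentioned below; the function name and extension handling are illustrative, not the actual code of attach_downloader.py:

```python
import hashlib

def signature_filename(data: bytes, extension: str = ".txt") -> str:
    """Name a saved answer after the MD5 digest of its content.

    Any change to the content, however small, yields a different
    digest, so the filename doubles as an integrity check.
    """
    return hashlib.md5(data).hexdigest() + extension

# Example: the same bytes always map to the same filename.
print(signature_filename(b"hello"))  # 5d41402abc4b2a76b9719d911017c592.txt
```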
Once the files have been saved (and backed up on at least two different servers), I send the registry of which electronic signature corresponds to each student to the course forum at Google Groups. This creates a public record, which cannot be modified, of what I received on the exam date. Each student gets the same email at the same time, so even if the forum is not accessible, many copies of the registry exist.
In that email the students are encouraged to verify that the file I received is the same file they sent. This can easily be done on the website http://onlinemd5.com/ without uploading the file. If any discrepancy is found, the student can request a correction before the questions are graded. Electronic signatures are the current international standard used to guarantee document integrity, authenticity and authorship. Nevertheless, if required, at this stage we can produce a paper copy of every document.
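The check the students perform can also be done offline in a couple of lines, assuming the published signature is an MD5 digest (a hypothetical helper for illustration):

```python
import hashlib
from pathlib import Path

def verify_answer(path: str, published_signature: str) -> bool:
    """Return True if the file's MD5 digest matches the published signature."""
    digest = hashlib.md5(Path(path).read_bytes()).hexdigest()
    return digest == published_signature.lower()
```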
Since all the students used the same template, all answers share the same structure. This allows us to use a program that extracts each question and gathers all its answers in a single question file: all answers to question 1 are put together in a file named Q1, answers to question 2 in a file named Q2, and so on.
The order of the students within each single-question file is randomized using a shuffle function. Each student is still identified by the electronic signature of the original file. This strategy has several benefits:
- Grading is easier when each question is evaluated separately. The professor does not need to switch context and can focus on a single topic.
- It is fairer, since it makes it easy to assign the same grade to answers of the same quality.
- It is less biased, since shuffling the order means that the students graded first on one question are graded last on others. It has been shown that the relative order of evaluations introduces an unwanted bias in the grades (Danziger et al.).
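The gather-and-shuffle step above can be sketched as follows. The question marker '# Question N' is a hypothetical template convention; the real template may use a different one:

```python
import random
import re
from collections import defaultdict

def gather_questions(answer_files: dict, seed=None) -> dict:
    """Group answers by question and shuffle the student order per question.

    answer_files maps each electronic signature to the text of that
    student's answer file. Returns {question_number: [(signature, answer)]}.
    """
    per_question = defaultdict(list)
    for signature, text in answer_files.items():
        # Split on template headers; re.split with a capture group yields
        # [preamble, num, body, num, body, ...].
        parts = re.split(r"^# Question (\d+)\s*$", text, flags=re.M)
        for num, body in zip(parts[1::2], parts[2::2]):
            per_question[int(num)].append((signature, body.strip()))
    rng = random.Random(seed)
    for answers in per_question.values():
        rng.shuffle(answers)  # independent random order for each question
    return per_question
```

Shuffling each question file independently is what spreads any order-related bias evenly across students.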
The grades are written directly in the single-question files. Later, another program collects all partial grades into a summary spreadsheet. This program verifies that every question has been evaluated and that no part is missing. In this file each Student Number is associated with the corresponding document’s electronic signature. For additional confidence, we manually verify that each student got the correct grade. This file is saved in .csv format and uploaded directly to AKSYS.
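The collection step can be sketched like this, assuming the per-question grades have already been parsed into dictionaries (the data shapes and function name are illustrative, not the actual program):

```python
import csv

def write_summary(grades, student_of_signature, out_path, n_questions):
    """Collect per-question grades into a summary CSV row per student.

    grades: {question_number: {signature: grade}}
    student_of_signature: {signature: student_number}
    Raises ValueError if any grade is missing, mirroring the
    completeness check described in the text.
    """
    header = ["student", "signature"]
    header += [f"Q{i}" for i in range(1, n_questions + 1)] + ["total"]
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(header)
        for sig, student in sorted(student_of_signature.items()):
            row_grades = []
            for q in range(1, n_questions + 1):
                if sig not in grades.get(q, {}):
                    raise ValueError(f"Missing grade: Q{q} for {sig}")
                row_grades.append(grades[q][sig])
            writer.writerow([student, sig] + row_grades + [sum(row_grades)])
```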
Using all these automatic tools reduces the risk of misplaced grades and other mistakes that could harm the students’ grades.
Showing the exam to the students
Again, using a structured document allows us to easily create a document for each student, showing all their answers with the corresponding grades. All graded answers are accessible on the web, on a separate page for each student.
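Since the data is already structured, rendering one page per student is straightforward. A minimal plain-text sketch (the report layout is an assumption; the real pages may use HTML):

```python
def student_report(student, graded):
    """Render one student's answers and grades as a simple text page.

    graded: list of (question_number, answer_text, grade) tuples.
    """
    lines = [f"Exam report for student {student}", ""]
    total = 0
    for num, answer, grade in graded:
        lines.append(f"Question {num} (grade: {grade})")
        lines.append(answer)
        lines.append("")
        total += grade
    lines.append(f"Total: {total}")
    return "\n".join(lines)
```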
If a student finds any error, mistake or omission, the grades can be easily corrected and updated.
This evaluation process, made specifically for the Computing in Molecular Biology courses, can be useful in other settings where answers are submitted using computers. Several of the ideas implemented here offer important advantages for a fair evaluation process.
In the future we can improve this process using a dedicated web server with special software that replaces the email system.
All programs are freely available at http://github.com/anaraven/gmail-tools.
Giles, R. M., Johnson, M. R., Knight, K. E., Zammett, S., & Weinman, J. (1982). Recall of lecture information: a question of what, when and where. Medical Education, 16, 264–268. doi:10.1111/j.1365-2923.1982.tb01262.x
Cuseo, J., Fecas, V. S., & Thompson, A. (2007). Thriving in College & Beyond: Research-Based Strategies for Academic Success & Personal Development. Dubuque, IA: Kendall/Hunt.
Classroom Seating Position and College Grades. https://www.altoona.psu.edu/fts/docs/SeatingPositionGrades.pdf
Danziger, S., Levav, J., & Avnaim-Pesso, L. (2011). Extraneous factors in judicial decisions. Proceedings of the National Academy of Sciences, 108(17), 6889–6892.
Gnaulati, E. (2014, September 18). Why Girls Tend to Get Better Grades Than Boys Do. The Atlantic. http://www.theatlantic.com/education/archive/2014/09/why-girls-get-better-grades-than-boys-do/380318/
Voyer, D., & Voyer, S. D. (2014). Gender Differences in Scholastic Achievement: A Meta-Analysis. Psychological Bulletin, 140(4), 1174–1204. doi:10.1037/a0036620