Every year the State Examinations Commission (SEC) undertakes a massive logistical challenge: organising exams for more than 100,000 students, marking countless scripts and issuing about a million grades. To its credit, it manages to do this smoothly, efficiently and, usually, without major incident. A great strength of the system is that marking takes place anonymously and students have sight of their own marked papers.
However, this week a series of reports in The Irish Times has raised urgent questions over the transparency and fairness of the marking process. An unpublished research paper produced by the commission has concluded that the marking process for the Leaving Cert exams is rushed, unfair and risks compromising the accuracy of students’ grades.
These concerns stem from the way marks for individual questions are altered to ensure up to 100,000 students achieve grades broadly consistent from year to year. The report makes clear that this process “reduces fairness” by benefiting some students and penalising others; renders final marking schemes “less valid”; and “compromises the accuracy of marking”. These are not allegations or accusations from third parties: these are comments made by the SEC itself.
It also emerged this week that hundreds of students are being awarded “estimated grades”. There are many valid reasons for this. Students should not be disadvantaged by errors outside their control, such as lost scripts. However, a detailed breakdown of these grades raises questions over whether they are being fairly applied in all cases. Some students received estimated grades after complaining of distractions such as noise outside exam centres, stopped clocks or ants. The fact that this process was not publicly acknowledged or referred to in any published documents will add to questions over its transparency.
Today, there are allegations over the use of “focused monitoring” in the marking process. This practice is said to involve targeting scripts within particular grade boundaries with the intent of moving grades up or down to bring results into line with previous years. The SEC insists this does not happen, though a number of well-placed sources claim otherwise.
Most of these issues stem from the very unusual approach used in Ireland to ensure there is consistency in exam results each year. Other jurisdictions use techniques which are simpler and fairer. We, on the other hand, are using a process that is convoluted, lacks transparency and is unfair to some students. The fact that this process undermines the validity of what a mark on an exam means is worrying. If public confidence in the exams is to be maintained, the system of marking must be fair to all students.