National Qualifications – estimating success: a response to Coronavirus (COVID-19)


Pupils and students across Scotland will receive their Scottish Qualifications Authority (SQA) grades on Tuesday 4 August 2020.  As ever, the results will be pored over and be the subject of heated political debate.  So far – so familiar.

But, of course, this year will be very different.  The qualifications which get the most attention are National 5s, Highers and Advanced Highers. This year these qualifications will be based not on candidates’ coursework and exams, but on their teachers’ judgements.

This blog will look at the cancellation of the 2020 exam diet and the method of certification the SQA has put in place for those qualifications.  

Cancelled exams

The decision to cancel the 2020 exam diet and the outline of the plans for certification of school and college courses happened in a breathless few weeks from mid-March.  Guidance and policy changed almost day-by-day as the implications of the pandemic and the lockdown became clear.

In this period, it was not clear who made which decisions, or which options were considered.  We know that the decision to cancel the diet was made by the Scottish Government and that the SQA was tasked with developing the alternative model of certification. 

The Qualifications Contingency Group, which includes representatives of the further and higher education sectors, was established on 17 March.  Its remit is to:

  • share information and intelligence to ensure clear understanding of relevant developments in relation to delivery of a certification model for the awarding of qualifications in the absence of the 2020 exam diet
  • promote collective leadership across the education system in Scotland
  • contribute to ongoing communications.

From its remit, it appears that the group had limited involvement with the development of the certification model in 2020.  The SQA did not consult, at least publicly, on how it would approach certification in 2020.

The SQA’s publications focus on explaining how teachers, schools and colleges would support the process of certification and what this might mean for candidates.  The SQA’s rationale for choosing the model it did, and whether other models were explored, is not explained beyond fairly broad principles. This is in contrast to Ofqual (the UK Office of Qualifications and Examinations Regulation), which set out its approach and subjected it to public consultation.

Process for 2020

The SQA has identified three principles for this work:

  • Fairness to all learners.
  • Safe and secure certification of our qualifications.
  • Maintaining the integrity and credibility of our qualifications system, ensuring standards are maintained over time.

The core element of certification of exam subjects in schools will be teachers’ estimates of grades of individual candidates.  The SQA explains:

[Teachers] are best placed to have a strong understanding of how your learners have performed and, based on experience and the evidence available, what a learner would be expected to achieve in each course. An estimated grade is not just the result of one prelim or one project, but is a judgement based on activity across the year.

The process of estimation by teachers had three stages:

  • Determining the estimated grade (as is done every year).
  • Placing candidates into subdivisions of the existing bands to give refined bands.
  • Ranking all candidates at the school undertaking the qualification within the bands.

The first bullet is part of the routine work of teachers who provide the SQA with estimates in Grade bands every year, e.g. A1 or B3 (upper A and upper B respectively).  This information is used by the SQA as part of the administration of exam results. 

The additional subdivisions in the process for 2020 required teachers to place candidates in finer subdivisions within the Grade bands.  Once teachers had placed candidates into the refined bands, they ranked the candidates within them.  The SQA has said that the placing of candidates in rank order is to “support broad statistical analyses”. 
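The three-stage structure above can be sketched in code. This is a purely illustrative sketch: the candidate names, the “upper”/“lower” refinement labels and the ordering rule are hypothetical, as the SQA has not published the exact format of the refined bands.

```python
# Purely illustrative sketch - not SQA code. Candidate names, the
# "upper"/"lower" refinement labels and the band orderings are hypothetical.

candidates = [
    # (name, grade band estimate, refined band within that grade band)
    ("Candidate 1", "A1", "upper"),
    ("Candidate 2", "A1", "lower"),
    ("Candidate 3", "B3", "upper"),
    ("Candidate 4", "A1", "upper"),
]

# Orderings for the hypothetical labels: better bands sort first.
band_order = {"A1": 0, "A2": 1, "B3": 2, "B4": 3}
refined_order = {"upper": 0, "lower": 1}

# Stage 3: place every candidate taking the subject into a single rank
# order, sorting first by grade band, then by refined band. Python's sort
# is stable, so ties within a refined band keep the teacher's original order.
ranked = sorted(candidates, key=lambda c: (band_order[c[1]], refined_order[c[2]]))
ranking = [(position + 1, name) for position, (name, _, _) in enumerate(ranked)]
```

The refined bands do the heavy lifting here: the rank order mostly falls out of the band placements, with teacher judgement breaking any remaining ties.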

The SQA put in place a number of quality assurance mechanisms.  Before the estimates are provided to the SQA, they should be signed off by two teachers, including the subject lead, and then by the headteacher.  In signing off the submission to the SQA, headteachers are asked to consider “how the distribution of centre estimates compares with the performance of the previous three years’ cohorts within the centre, especially for subjects with larger cohorts.”
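One way to picture that headteacher check is below. The figures and the method (a simple three-year average) are entirely hypothetical: the SQA has not specified how the comparison should be made.

```python
# Entirely hypothetical figures and method - the SQA has not published how
# the comparison is to be carried out. Values are the proportion of the
# centre's cohort at each grade in one subject.

previous_years = {
    2017: {"A": 0.25, "B": 0.30, "C": 0.30, "No award": 0.15},
    2018: {"A": 0.28, "B": 0.29, "C": 0.28, "No award": 0.15},
    2019: {"A": 0.26, "B": 0.31, "C": 0.29, "No award": 0.14},
}
estimates_2020 = {"A": 0.40, "B": 0.30, "C": 0.22, "No award": 0.08}

# Average share of each grade across the previous three years' cohorts.
average = {
    grade: sum(year[grade] for year in previous_years.values()) / len(previous_years)
    for grade in estimates_2020
}

# How far this year's estimates depart from that average, per grade.
# A large positive value for "A" would flag a markedly more generous
# distribution than the centre's recent history.
difference = {grade: round(estimates_2020[grade] - average[grade], 3)
              for grade in estimates_2020}
```

On these made-up numbers, the share of A estimates sits well above the three-year average, which is exactly the sort of pattern a headteacher would be expected to query before signing off.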

The 20 April statement about the new system by Fiona Robertson, SQA Chief Executive and Scotland’s Chief Examining Officer, said:

“We will use the information from these estimates, in addition to prior learner attainment, where this is available. For example, if learners achieved National 5 or Higher courses, in a previous year.

“We will also look at schools’ and colleges’ previous history of estimating and attainment in each subject and level. We may moderate these estimates, up or down, if that is required.

“This process will produce the results for learners, using our national grades for each subject and level.”

It is worth reiterating that the SQA will moderate on the basis of cohorts, not at an individual level.  What statistical tests the SQA might apply, and what actions it might take on the basis of those tests, are not clear.  Should the SQA decide to change candidates’ grades, the SQA will not discuss this with the school or college and teachers prior to results day.  The SQA has indicated that more details of this process will be released on results day.

Ofqual has indicated (see summer symposium slides) that schools’ estimated grades in England, particularly for A-Levels, are higher than attainment in previous years.  Ofqual has said that after the standardisation process, results would still be a little higher than usual and that the results in 2020, at a national level, would not be comparable with previous years.

Validity and reliability

It is a debatable point whether a qualification based largely on teachers’ judgement is better or worse than one based largely on an externally verified exam. 

On the one hand, a teacher’s judgement can encompass assessment across the whole curriculum in a variety of ways (for example, orally or working as a team), whereas an exam samples only some of that knowledge in the peculiar setting of an exam hall.  On the other hand, a standardised test could be argued to be more reliable than teacher judgement, in that the marking of exam scripts can be more tightly moderated. 

In other words, and this is a gross simplification, teacher judgements are potentially good in terms of validity (how well a grade reflects the learning and achievement of the pupil), but care needs to be taken in terms of reliability (how likely it is that two similar candidates will receive the same grade). Exams are considered reliable but cannot cover the whole curriculum.

Unconscious bias

One of the benefits of external exams is that the person marking the work does not know the individual and therefore the result is, at the stage of certification, blind to the personal characteristics of the candidate, such as gender, race or social class.

One of the issues that has been highlighted since April is how the SQA’s plans accord with its public sector equality duties under the Equality Act 2010.  A key concern was the possibility of unconscious bias in estimating grades.

This issue was addressed in Ofqual’s consultation. Ofqual produced an “Equality impact assessment: literature review” which said—

“Studies of potential bias in teacher assessment suggest that differences between teacher assessment and exam assessment results can sometimes be linked to student characteristics like gender, special educational needs, ethnicity and age. However, such effects are not always seen, and when they are present, are small and inconsistent across subjects.”

A paper by the Equality and Human Rights Commission (EHRC) responding to Ofqual’s consultation asked for specific guidance to be issued to teachers “on the approach which teachers should take to predicting grades and ranking pupils in order to minimise the risk of conscious or unconscious bias”.  In addition, the EHRC stated that data on socio-economic background and the protected characteristics of assessed pupils should be collected by exam boards to be used as part of the standardisation model used by those exam boards. Aside from gender, these details are not routinely collected by the SQA at a candidate level.

Training provided by the SQA addressed the risk of unconscious or implicit bias.  The SQA has indicated that it will publish an equality impact assessment on the alternative certification method on results day.

Post results service

The SQA has announced it will have an appeals service which will be free of charge this year.  SQA guidance states that schools and colleges will be able to ask the SQA to review an awarded grade, if that grade is lower than that estimated by the centre.  As is usual, parents, carers or young people will not be able to ask for a review directly.

Schools and colleges will need to provide evidence to the SQA for each request to review.

Next steps

In correspondence to the Education and Skills Committee, the SQA stated—

“The scale and complexity of the changes required, and at this time of year, are simply unprecedented.”

In relying on teacher judgement, the SQA has faced challenges in terms of reliability.  It has sought to minimise these through training and, presumably, through its moderation activity.  From a policy point of view, how the SQA has sought to ensure consistency across the country and in relation to previous years will be of great interest.

However, at the level of the individual, we can think of this a little differently.  When we ask our loved ones, “how did your exams go?” we are presumably not asking because we are interested in how good they are at doing particular exams on those particular days.  We might be interested because we are proud of their achievements over the full year, or we might be interested in what opportunities they have for further learning or employment.  Our young people will still go to colleges and universities and start modern apprenticeships and begin their careers.

Ned Sharratt, Senior Researcher, Education