A for algorithm or U for unfair?
Danielle Amor
24/08/2020

The chaos that has ensued from the regrading of A-Levels in the UK highlights the far-reaching impact of automated decision-making and why its use is restricted under the GDPR.

The law

A decision based solely on automated processing which “produces legal effects concerning [the data subject] or similarly significantly affects [the data subject]” is prohibited under Article 22 of the GDPR.

There are some limited exceptions – (i) if the decision-making is necessary for the performance of a contract; (ii) if the decision-making is authorised by law and there are suitable safeguards for the rights and interests of the individual; or (iii) if the individual has provided their explicit consent.

We can safely assume that A-Level students did not consent to the standardisation process! The other two exceptions do not apply here either.

The algorithm

Following a fairly thorough process, teachers provided a “centre assessed grade” (CAG) for each student in each subject and ranked the students in each subject from first to last. However, there was concern that the CAGs would be more generous than a typical exam mark. To moderate the grades, Ofqual developed an algorithm which took the CAGs and compared them with the performance of the relevant school or college, and of its students, in previous years to produce a standardised distribution of grades for each cohort. Those standardised grades were then allocated to students according to the rank order provided by their teachers.
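To make the mechanics concrete, here is a minimal sketch of that final allocation step in Python. It is not Ofqual's actual model, which combined several statistical adjustments; it simply assumes, for illustration, that a cohort's grades are drawn straight from the centre's historical grade distribution and handed out in teacher-assigned rank order. The function name, grade scale and figures are all hypothetical.

```python
# Illustrative only: a toy version of rank-based grade allocation.
# Assumes a cohort's grades come purely from the centre's historical
# grade distribution, ignoring CAGs entirely - a simplification of
# the real Ofqual model, which was considerably more complex.

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def allocate_grades(historical_share, cohort_size):
    """Return a grade per student, index 0 = top-ranked student.

    historical_share: fraction of past students at the centre who
                      achieved each grade (values sum to 1.0).
    cohort_size:      number of students in this year's cohort.
    """
    allocation = []
    cumulative = 0.0
    for grade in GRADES:
        cumulative += historical_share.get(grade, 0.0)
        # Fill ranks up to the cumulative share of the cohort.
        while len(allocation) < round(cumulative * cohort_size):
            allocation.append(grade)
    # Rounding can leave the list short; pad with the lowest grade.
    while len(allocation) < cohort_size:
        allocation.append(GRADES[-1])
    return allocation
```

The design point the sketch captures is that the distribution, not the individual, drives the outcome: a student's position in the rank order determines which slice of the centre's historical results they receive.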

So, was the regrading solely the result of automated decision-making? Ofqual, in its privacy impact assessment, says not. Rather the algorithm “support[ed] the human intervention involved in the form of [CAGs] and rank orders … determined by teachers and signed off by other individuals within the centre”, with particular emphasis placed on the rank order.

The result

Almost 40% of CAGs in England were downgraded. Students attending smaller colleges and studying more niche subjects were more likely to receive their CAG, because results at small centres naturally vary more from year to year and the algorithm had too little historical data to standardise them reliably. Many independent schools fell within this category. Students attending larger colleges and studying more traditional or popular subjects, typically in the state sector, were less likely to receive their CAG. Further, if a percentage of students at a particular college had previously received a U grade, for example, the bottom-ranked students at that college were likely to receive a U grade, irrespective of their CAG.
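Running the hypothetical sketch above reproduces that last effect: give it a centre whose past results included some U grades and the bottom-ranked students receive a U, whatever their teachers predicted.

```python
# Hypothetical centre where 10% of past students were ungraded (U).
past = {"A": 0.10, "B": 0.30, "C": 0.40, "D": 0.10, "U": 0.10}
ranked = allocate_grades(past, 20)
print(ranked[-2:])  # ['U', 'U'] - the two bottom-ranked students,
                    # irrespective of their CAGs
```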

The reaction

Rather than instilling public confidence, the use of the algorithm led critics to accuse Ofqual, the exam boards and the government of systemic bias against students from more deprived backgrounds. The UK government and each of the devolved administrations have now announced that they will revert to the CAGs where these are higher than the standardised grades.

What next?

Whilst the immediate fury is likely to subside with the decision to revert to CAGs, the threatened legal action under the GDPR and the Equality Act could still proceed where university places have been lost as a result of the initial regrading. Watch this space.

Danielle Amor, Senior associate, Danielle.Amor@pannonecorporate-com.stackstaging.com

For further information please contact our specialist data protection team.
Amy Chandler, Partner, Amy.Chandler@pannonecorporate-com.stackstaging.com
Danielle Amor, Senior solicitor, Danielle.Amor@pannonecorporate-com.stackstaging.com
Patricia Jones, Consultant, patricia.jones@pannonecorporate-com.stackstaging.com
