WHY THIS MATTERS IN BRIEF
Our futures will increasingly be dictated to us by algorithms, and that could be very dangerous for us all.
As the debate around the benefits of Artificial Intelligence (AI) rages on, and the spectre of AI bias keeps rearing its ugly head, the UK government has backed down. Having proposed using AI to rate schools back in 2018, the government opted to use an algorithm to decide students’ grades after national exams were disrupted by Covid-19 and lockdown. Now it has said that students in England and Wales will no longer receive exam results based on that controversial algorithm, following accusations that the system was biased against students from poorer backgrounds, Reuters and BBC News report. The announcement followed a weekend of demonstrations at which protesters chanted “fuck the algorithm” outside the country’s Department for Education.
Now, instead of being graded by AI, students will receive grades based on their teachers’ estimates after formal exams were cancelled due to the global pandemic. The announcement follows a similar U-turn in Scotland, which had previously seen 125,000 results downgraded.
In the UK, A-levels are the set of exams taken by students around the age of 18. They’re the final exams taken before university, and they have a huge impact on which institution students attend. Universities make offers based on students’ predicted A-level grades, and usually, a student will have to achieve certain grades to secure their place.
In other words: it’s a stressful time of year for students, even before the country’s exam regulator used a cock-eyed, controversial algorithm to estimate their grades automatically.
As the BBC explains, the UK’s Office of Qualifications and Examinations Regulation (Ofqual), whose CEO just quit because of the fiasco, relied primarily on two pieces of information to calculate grades – namely, the ranking of students within a school and their school’s historical performance.
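Ofqual’s full model had many more moving parts, but the core idea of grading from those two inputs can be sketched in a few lines of Python. The assign_grades helper below is hypothetical, purely for illustration, not Ofqual’s actual code: it maps each student’s rank percentile onto their school’s historical grade distribution.

```python
from bisect import bisect_left

def assign_grades(ranked_students, historical_distribution):
    """Hypothetical sketch, not Ofqual's actual model: map each
    student's rank percentile onto the school's historical grade
    distribution (ordered best grade first)."""
    # Build cumulative cut-offs, e.g. top 5% -> A*, next 15% -> A, ...
    cutoffs, grades, cumulative = [], [], 0.0
    for grade, share in historical_distribution.items():
        cumulative += share
        cutoffs.append(cumulative)
        grades.append(grade)

    n = len(ranked_students)
    results = {}
    for rank, student in enumerate(ranked_students):  # rank 0 = best
        percentile = (rank + 0.5) / n                 # mid-rank percentile
        results[student] = grades[bisect_left(cutoffs, percentile)]
    return results

# A school whose past cohorts split 5% A*, 15% A, 30% B, 30% C, 15% D, 5% U:
history = {"A*": 0.05, "A": 0.15, "B": 0.30, "C": 0.30, "D": 0.15, "U": 0.05}
print(assign_grades(["alice", "bob", "carol", "dave"], history))
# {'alice': 'A', 'bob': 'B', 'carol': 'C', 'dave': 'D'}
```

Note that the only inputs are a rank and a history; nothing a student actually wrote in 2020 appears anywhere in the calculation.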
The system was designed to generate what are, on a national level, broadly similar results to previous years. And overall, that’s what the algorithm accomplished, with The Guardian reporting that overall results are up compared to previous years, but only slightly: the percentage of students achieving an A* to C under the algorithm’s grading rose by 2.4 percentage points compared to last year.
But it’s also led to thousands of grades being lowered from teachers’ estimations: 35.6 percent of grades were adjusted down by a single grade, 3.3 percent went down by two grades, and 0.2 percent went down by three. That means a total of almost 40 percent of results (35.6 + 3.3 + 0.2 = 39.1 percent) were downgraded. That’s life-changing news for anyone who needed to achieve their predicted grades to secure their place at the university of their choice.
Worse still, data suggests that fee-paying private schools, also known as “independent schools,” disproportionately benefited from the algorithm used. These schools saw the proportion of grades at A and above increase by 4.7 percentage points compared to last year, Sky News reports. Meanwhile, state-funded “comprehensive” schools saw an increase of less than half that, at just 2 percentage points.
A variety of factors seem to have biased the algorithm. One theory put forward by FFT Education Datalab is that Ofqual’s approach varied depending on how many students took a given subject, and this decision seems to have led to fewer grades being downgraded at independent schools, which tend to enter fewer students per subject (see the sketch below). The Guardian also points out that what it calls a “shockingly unfair” system was happy to boost the number of “U” grades, aka fails, while rounding down the number of A* grades, and one university lecturer has pointed out other failings in the regulator’s approach.
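FFT Education Datalab’s theory hinges on cohort size: small entries reportedly leaned heavily on teachers’ estimates, while large entries were fully standardised. A minimal sketch of such a taper, assuming illustrative thresholds of 5 and 15 students (these are my assumptions, not Ofqual’s published values), shows why small classes, which are more common at independent schools, would largely escape downgrading:

```python
def teacher_grade_weight(cohort_size, small=5, large=15):
    """Illustrative taper: the weight given to the teacher-assessed
    grade, from 1.0 (tiny cohorts, teacher estimate stands) down to
    0.0 (large cohorts, fully standardised). Thresholds are assumed
    for illustration, not Ofqual's actual values."""
    if cohort_size <= small:
        return 1.0
    if cohort_size >= large:
        return 0.0
    return (large - cohort_size) / (large - small)

print(teacher_grade_weight(4))    # 1.0 -> small independent-school class
print(teacher_grade_weight(10))   # 0.5 -> blended
print(teacher_grade_weight(30))   # 0.0 -> big comprehensive cohort
```

Since teachers’ estimates were, on the whole, generous, any weighting scheme along these lines mechanically favours schools with smaller classes.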
Fundamentally, however, because the algorithm placed so much importance on a school’s historical performance it was always going to cause more problems for high-performing students at underperforming schools, where the individual’s work would be lost in the statistics. Average students at better schools, meanwhile, seem to have been treated with more leniency. Part of the reason the results have caused so much anger is that this outcome reflects what many see as the wider biases of the UK’s education system.
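Reusing the hypothetical assign_grades sketch from above, the cap is easy to see: if a school’s history contains no A* grades, the model simply cannot award one, however exceptional this year’s top student is.

```python
# A school whose past cohorts never produced an A*:
weak_history = {"A": 0.10, "B": 0.30, "C": 0.40, "D": 0.15, "U": 0.05}
cohort = [f"pupil_{i}" for i in range(20)]   # pupil_0 is top-ranked
print(assign_grades(cohort, weak_history)["pupil_0"])
# 'A' at best: an A* is impossible here, whatever pupil_0's actual work
```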
The government’s decision to ignore the algorithmically determined grades has been welcomed by many, but even using teachers’ predictions comes with its own problems. As Wired notes, some studies have suggested such predictions can suffer from racial biases of their own. One study from 2009 found, for example, that Pakistani pupils were 62.9 percent more likely than their white counterparts to be predicted a lower score in one set of English exams, and that results for boys from Black and Caribbean backgrounds can jump when they’re assessed anonymously from age 16 onwards.
However, while this is just one more example of how algorithms can make or break your future, whether they’re deciding your grades, your mortgage application, or the outcome of your job interview, the one thing that’s certain is that they’re here to stay. So, one way or another, we now have to solve the problem of AI bias, for example by developing explainable AI systems and finding fairer ways to build AI models, and grapple with the consequences of an increasingly algorithm-first society.