WHY THIS MATTERS IN BRIEF
Our futures will increasingly be dictated for us by algorithms, and that could be very dangerous for us all.
Now, instead of being graded by AI, students will receive grades based on their teachers’ estimates after formal exams were cancelled due to the global pandemic. The announcement follows a similar U-turn in Scotland, which had previously seen 125,000 results downgraded.
In the UK, A-levels are the set of exams taken by students around the age of 18. They’re the final exams taken before university, and they have a huge impact on which institution students attend. Universities make offers based on students’ predicted A-level grades, and usually, a student will have to achieve certain grades to secure their place.
In other words: It's a stressful time of year for students, even before the country's exam regulator used a controversial algorithm to automatically estimate their grades.
The system was designed to generate results that were, at a national level, broadly similar to previous years'. And overall, that's what the algorithm accomplished, with The Guardian reporting that overall results are up compared to previous years, but only slightly: the percentage of students achieving grades A* to C under the algorithm's grading rose by 2.4 percentage points compared to last year.
But it's also led to thousands of grades being lowered from teachers' estimations: 35.6 percent of grades were adjusted down by a single grade, 3.3 percent went down by two grades, and 0.2 percent went down by three. That means a total of almost 40 percent of results were downgraded. That's life-changing news for anyone who needed to achieve their predicted grades to secure their place at their university of choice.
Worse still, data suggests that fee-paying private schools, also known as "independent schools," disproportionately benefited from the algorithm. These schools saw the proportion of grades at A and above increase by 4.7 percentage points compared to last year, Sky News reports. Meanwhile, state-funded "comprehensive" schools saw an increase of less than half that, at just 2 percentage points.
A variety of factors seem to have biased the algorithm. One theory, put forward by FFT Education Datalab, is that Ofqual's approach varied depending on how many students took a given subject, and this decision seems to have led to fewer grades being downgraded at independent schools, which tend to enter fewer students per subject. The Guardian also points out that what it calls a "shockingly unfair" system was happy to boost the number of "U" grades, aka fails, while rounding down the number of A* grades, and one university lecturer has pointed out other failings in the regulator's approach.
Fundamentally, however, because the algorithm placed so much importance on a school’s historical performance it was always going to cause more problems for high-performing students at underperforming schools, where the individual’s work would be lost in the statistics. Average students at better schools, meanwhile, seem to have been treated with more leniency. Part of the reason the results have caused so much anger is that this outcome reflects what many see as the wider biases of the UK’s education system.
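To make that mechanism concrete, here is a deliberately simplified sketch of rank-based standardisation, in which each student's position in the teacher's rank order is mapped onto the school's historical grade distribution, with small cohorts falling back to teacher estimates. This is not Ofqual's actual model: the grade scale, the cohort-size threshold, and the example data are illustrative assumptions.

```python
# Simplified illustration of rank-based grade standardisation.
# NOT Ofqual's actual model: the threshold, grade scale, and data below are
# invented purely to show why a strong student at a historically weak school
# can be downgraded regardless of their own ability.

GRADES = ["U", "E", "D", "C", "B", "A", "A*"]  # lowest to highest
SMALL_COHORT = 5  # assumed cut-off below which teacher estimates are kept


def standardise(ranked_students, historical_distribution):
    """Map students (best first) onto the school's historical grade shares.

    ranked_students: list of (name, teacher_estimate), best student first.
    historical_distribution: {grade: share of past pupils}, summing to 1.
    """
    n = len(ranked_students)
    if n < SMALL_COHORT:
        # Small classes: keep the teacher's estimated grades unchanged.
        return dict(ranked_students)

    results = {}
    cumulative = 0.0
    idx = 0
    for grade in reversed(GRADES):  # allocate from the top grade downwards
        cumulative += historical_distribution.get(grade, 0.0)
        upper = round(cumulative * n)  # how many students history "allows" so far
        while idx < min(upper, n):
            name, _teacher_estimate = ranked_students[idx]
            results[name] = grade
            idx += 1
    while idx < n:  # any rounding leftovers get the lowest grade
        results[ranked_students[idx][0]] = GRADES[0]
        idx += 1
    return results


# A historically weak school: nobody here has achieved an A* before.
history = {"A*": 0.0, "A": 0.05, "B": 0.15, "C": 0.30,
           "D": 0.30, "E": 0.15, "U": 0.05}

# Teacher estimates, best student first; the top student was predicted an A*.
cohort = [("Asha", "A*"), ("Ben", "B"), ("Cara", "B"), ("Dev", "C"),
          ("Ede", "C"), ("Fin", "D"), ("Gus", "D"), ("Hana", "E")]

print(standardise(cohort, history))
```

Run this and the top-ranked student's predicted A* becomes a B, because the school's history leaves no room for an A*, which is exactly the pattern that punished outliers at underperforming schools.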
The government's decision to ignore the algorithmically determined grades has been welcomed by many, but even using teachers' predictions comes with its own problems. As Wired notes, some studies have suggested such predictions can suffer from racial biases of their own. One study from 2009 found, for example, that Pakistani pupils were 62.9 percent more likely than their white counterparts to be predicted a lower score in one set of English exams, and that results for boys from Black and Caribbean backgrounds can improve when they're assessed anonymously from age 16.
However, this is just yet another example of how algorithms can make or break your future, whether they're being used to decide your grades, your mortgage application, or the outcome of your job interview. The only thing that's certain is that they're here to stay. So, one way or another, we now have to solve the problem of AI bias, for example by developing explainable AI systems and finding fairer ways to create AI models, and we have to face the consequences of an increasingly algorithm-first society.
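One of the simplest ways to check a model for this kind of bias is to compare outcomes across groups. The sketch below computes downgrade rates by school type and flags a large gap; the records and the 10-percentage-point tolerance are made up for illustration, and real audits would use richer metrics such as equalised odds or calibration.

```python
# Minimal, illustrative bias audit: compare downgrade rates across groups.
# The records and the 0.10 tolerance are assumptions made for this sketch.
from collections import defaultdict

# Each record: (school_type, was_the_teacher_estimate_downgraded)
records = [
    ("independent", False), ("independent", False), ("independent", True),
    ("comprehensive", True), ("comprehensive", True), ("comprehensive", False),
    ("comprehensive", True),
]

downgrades, totals = defaultdict(int), defaultdict(int)
for group, downgraded in records:
    totals[group] += 1
    downgrades[group] += int(downgraded)

rates = {group: downgrades[group] / totals[group] for group in totals}
print("Downgrade rate by school type:", rates)

# Demographic-parity-style gap between the worst and best treated groups.
gap = max(rates.values()) - min(rates.values())
print(f"Gap: {gap:.0%}")
if gap > 0.10:  # assumed tolerance
    print("Warning: outcomes differ materially across groups; investigate the model.")
```

A check like this would not have fixed the model on its own, but it is the kind of basic audit that could have flagged the disparity before results day.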
Matthew Griffin, described as "The Adviser behind the Advisers" and a "Young Kurzweil," is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the "Codex of the Future" series.
Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew's ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world's foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multinationals, and regulators around the world envision, build, and lead an inclusive, sustainable future.
A rare talent, Matthew's recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world's largest organisations envision and ideate the future of their products and services, industries, and countries.
Matthew's clients include three Prime Ministers and several governments, including the G7, Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.