David Wain Chadwick
2 min read · Sep 4, 2020

Can algorithms be mutants?

Boris Johnson decided to sidestep responsibility for the A-level exam results fiasco by blaming a “mutant algorithm”. His description of the algorithmic decision-making that created such disarray for thousands of schoolchildren sounded a bit like a Roman general blaming the gods for losing a battle after telling his cavalry regiment to take the day off. A bad workman blames his tools.

The ubiquity of modern technology means our lives are shaped by algorithms. They guard our inboxes from junk mail, they decide what our social media feeds show us, and they help marketeers pick the right products to try to sell us. But because nobody has ever explained this to us, public understanding of how algorithms work is low, which is probably why Boris thought he could get away with describing one like a character from a sci-fi novel.

Put simply, an algorithm is a calculator. It does what you tell it to. You provide the input and it produces your desired output. You feed it data and specify the outcome you would like it to produce. It is obedient. It is servile. It’s not a young child that appears at the foot of your bed early in the morning regardless of how many times you ask it for a lie-in. Even the most advanced use of algorithms, deep learning, is just algorithms calculating other algorithms.
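To make that concrete, here is a toy example in Python; the function and numbers are purely illustrative. The same input always produces the same output, because a human wrote the rule.

```python
# A toy algorithm: a fixed recipe chosen by a human.
def algorithm(x):
    return 2 * x + 1  # the "formula" somebody decided to type in

print(algorithm(10))  # 21, every single time; it never mutates
```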

The exam results weren’t the result of a stray algorithm breaking out of its cage and deciding to ruin students’ lives; they happened because somebody consciously typed the formula that produced them into a computer. The decision to favour results from small schools would have involved a specific weighting being chosen. The final output, improved grades for children from top-performing schools and worse grades for kids from poor-performing schools, would have been clear well in advance. Fortunately, the backlash against these unfair results was strong enough for the government to backpedal and rely solely on teacher assessments.
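Ofqual’s actual standardisation model was more elaborate than anything shown here, but a hypothetical sketch, with invented names, weights and thresholds throughout, shows how deliberate such a formula has to be:

```python
# A hypothetical sketch, NOT Ofqual's actual model: blend a teacher-assessed
# grade with a school's historical results, trusting the teacher more for
# small cohorts. Every name and number below is invented for illustration.
def standardised_grade(teacher_grade, school_historical_grade, cohort_size,
                       small_cohort_threshold=15):
    if cohort_size <= small_cohort_threshold:
        weight = 1.0   # small schools: keep the teacher's assessment
    else:
        weight = 0.2   # large schools: lean heavily on past school results
    return weight * teacher_grade + (1 - weight) * school_historical_grade

# A strong student at a large school with historically weak results is pulled down:
print(standardised_grade(teacher_grade=8, school_historical_grade=5, cohort_size=120))  # 5.6
```

Nothing about the output of a function like this is a surprise: once you pick the weights, you have already picked the winners and losers.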

If this hadn’t happened, it is very likely that students could have complained to the ICO under Article 22 of the GDPR, which says that people “have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”. Forthcoming legislation may give us more of an idea of how to handle the use of algorithms. The government’s own Online Harms White Paper suggests that social media companies should be more transparent in their use of algorithms, possibly being obliged to provide explanations of how their algorithms work.

When it comes to technology, ignorance is not bliss. Algorithms do not mutate by themselves. The next time a politician tries to fob off responsibility for a disastrous outcome affecting thousands of children, remember that algorithms are a human creation, made for humans by humans. We don’t need protecting from mutant algorithms; we need protecting from the humans making them.
