Code Dependent by Madhumita Murgia review – understanding the human impacts of AI


What’s in the box? That’s the question almost everyone that Madhumita Murgia speaks to seems to be asking. If the “black box” of the algorithm is going to make critical decisions about our health, education or human rights, it would be nice to know exactly what it contains. So long as it remains a mystery, what can you do if you find your child added to a list of potential future criminals based on flawed, even racist, data, as happened to hundreds of families in the Netherlands in the late 2010s?

It’s these Kafkaesque absurdities, and how they play out on a human level, that interest Murgia, the Financial Times’s first artificial intelligence editor. Code, she reminds us several times in this troubling book, is not neutral.

This isn’t a story about ChatGPT and the other large language models and their looming impact on everything from Hollywood to homework, though there is a bit of that. Instead, it’s an account of how the everyday algorithms we have already learned to live beside are changing us: from the people paid (not much) to make sense of vast datasets, to the unintended consequences of the biases they contain. It’s also the story of how the AI systems built using that data benefit many of us (you, ordering McDonald’s on UberEats) at the expense of some – usually individuals and communities that are already marginalised (the young immigrant worker picking up your Big Mac for a small fee).

The scope of Murgia’s reporting here, reflective of her day job, is vast. She takes us from the sometimes comically basic way AI systems are trained (workers in a Kenyan office block labelling road signs to teach driverless cars to recognise them) to how its flaws play out in the finished product (the delivery drivers paid less because their app doesn’t account for delays caused by roadworks or having to cycle uphill).

Murgia also argues that we are seeing the emergence of a new data colonialism. Yes, many subcontracted AI labourers are pulled out of poverty by this work, but the wealth created by it is not shared equitably. Then there’s the sheer monotony of it, the inability to deviate from instructions and regional disparities in pay and job security, not to mention the PTSD that results from being forced to look at the worst images on the internet so the rest of us don’t have to. And, crucially, because the AI supply chain is broken down into chunks, many of these workers are told nothing about the purpose of the task they have been given – or even who they’re working for.

One Kenyan lawyer Murgia meets sees algorithm training as another version of the Bangladeshi clothing industry that supplies western fast-fashion brands. Or even the designer goods sector: “The factory worker just thinks, all I’m making is a shoe. They don’t know their shoe is being sold for $3,000.”


There is a little optimism. Murgia – a former Wired writer – is alive to the potential of AI to improve health outcomes. And we meet people such as Hiba, whose family, refugees from Falluja, in Iraq, used their work as data labourers to fund a new life in Bulgaria. There are also cheering stories of how frustration at being exploited has led to gig economy workers quietly organising – even in China – to regain some of the autonomy they have sacrificed at “the altar of the algorithm”.

But the bass note here is pessimistic. We are way past the techno-boosterism of the early 00s, and, for every government official wondering how AI can help them streamline health and welfare services, there are thousands of people asking whether it will allow them to continue making a living. Worse than that, as in the Chinese government’s facial recognition systems and pre-emptive detention lists in Xinjiang, it’s the story of a dystopia we are already living in.

Code Dependent: Living in the Shadow of AI by Madhumita Murgia is published by Picador (£20).

