

Oct 14, 2024

James Muldoon, Mark Graham and Callum Cant: ‘AI feeds off the work of human beings’ (The Guardian)

The Fairwork trio talk about their new book on the ‘extraction machine’, exposing the repetitive labour, often in terrible conditions, that big tech is using to create artificial intelligence

James Muldoon is a reader in management at the University of Essex, Mark Graham a professor at the Oxford Internet Institute and Callum Cant a senior lecturer at the University of Essex business school. They work together at Fairwork, a project that appraises the working conditions in digital workplaces, and they are co-authors of Feeding the Machine: The Hidden Human Labour Powering AI.

Why did you write the book? James Muldoon: The idea for the book emerged out of field work we did in Kenya and Uganda on the data annotation industry. We spoke to a number of data annotators, and the working conditions were just horrendous. And we thought this is a story that everyone needs to hear. People working for less than $2 an hour on insecure contracts, work that is predominantly outsourced to the global south because of how difficult and dangerous it can be.

Why east Africa? Mark Graham: I started doing research in east Africa in 2009, really on the first of what was to be many submarine fibre-optic cables connecting east Africa to the rest of the world. And what the research was focused on was what this new connectivity meant for the lives of workers in east Africa.

How did you gain access to these workplaces? Mark Graham: At Fairwork the basic idea is that we establish principles of decent work and then we evaluate companies against them. We give them a score out of 10. And that’s how the companies in Nairobi and in Uganda opened up to us, because we were going to give them a score and they want a better score. We went to them with a zero out of 10 and we said: “Look, there’s some work to do to improve.”

And are companies responsive? Do they dispute your low scores? Mark Graham: There’s a whole range of responses. Some argue that the things we’re asking them to do are simply not possible. They’ll say things like: “It’s not our responsibility to do these things.” The beauty of scores is we can point to other companies that are doing them. We can say: “Look, this company does that. What’s wrong with you? Why can’t you have this condition for your workers?”

Can you talk about the echoes of colonialism that you found in this data work? Mark Graham: The old east African railway used to connect Uganda to the port of Mombasa. It was financed by the British government and it was basically used to extract resources from east Africa. What’s interesting about the fibre-optic connectivity in east Africa is that it runs along a very similar path to the old railway, and it too is a technology of extraction.

Could you explain your concept of the “extraction machine”? Callum Cant: When we see an AI product, we have this tendency towards thinking of it as being relatively spontaneously created and we don’t think of the human labour, the resource requirements and everything else that goes on behind it.

The extraction machine for us is a metaphor that allows us to think much more about whose labour, whose resources, whose energy, whose time, went into that process. The book is an attempt to go from this surface level appearance of a sleek webpage or the images of neural networks, to actually look at the embodied reality of when this comes to your workplace, what does AI look like and how does it interact with people?

James Muldoon: I think a lot of people would be surprised to learn that 80% of the work behind AI products is actually data annotation, not machine-learning engineering. And if you take the example of an autonomous vehicle, one hour of video data requires 800 human hours of data annotation. So it’s an incredibly intensive form of work.

How does this concept differ from Shoshana Zuboff’s idea of surveillance capitalism? James Muldoon: Surveillance capitalism best describes companies like Google and Facebook that make money primarily through targeted advertising. It’s an apt description of a data-to-advertising pipeline, but it doesn’t really capture the broader infrastructural role that big tech now plays. The extraction machine is an idea we developed to talk more broadly about how big tech feeds off the physical and intellectual work of human beings, be they Amazon workers, creatives, data annotators, content moderators. It’s really a much more visceral, political and global concept to show the ways in which all our labour is exploited and extracted by these companies.

A lot of the concerns about AI have been either about existential risks, or about how the technology can reinforce inequalities and biases that exist in the data it is trained on. But you are arguing that merely introducing AI into the economy creates a whole number of other inequalities? Callum Cant: We can see this very clearly in a workplace like Amazon. The Amazon AI system, their supply chain organising technology, has automated away the thinking process, and what the humans are left to do in an Amazon warehouse is this brutal, repetitive high-strain labour process. You end up with technology that is meant to automate menial work and create freedom and time, but in fact what you have is people being forced to do more routine, boring and less skilled work by the inclusion of algorithmic management systems in their workplace.

In one chapter of the book you write about Chloe, an Irish actor, who found that someone was using an AI-generated copy of her voice. This bears a resemblance to the recent dispute between Scarlett Johansson and OpenAI. She has a platform and the finances to challenge this situation; most people don’t. Callum Cant: Many of the solutions aren’t actually individual; they rely on collective power. Because as much as anyone else, we don’t have the ability to tell OpenAI what to do. They don’t care if some authors think that they’re running an extraction regime that takes information. These companies are funded by billions and billions of pounds of capital and don’t actually need to care about what we think of them.

But collectively, we identify a number of ways where we could push back and start to try to transform the way this technology is being deployed. Because I think all of us recognise there is an emancipatory potential here, but to reach that, it’s going to require a huge amount of collective work and conflict in many places, because there are people who are becoming immensely rich off this stuff and there are decisions being made by a very, very small handful of people in Silicon Valley that are making all of our lives worse. And until we force them to change how they’re doing that, I don’t think we’re going to get a better form of technology out of it.

What would you say to readers? What action could they take? Callum Cant: People are all in such different positions that it’s hard to give one universal piece of advice. If someone works at an Amazon warehouse, then organise your co-workers and use your leverage against your boss. If someone works as a voice actor, then you need to be organising with other voice actors. But everyone’s going to have to respond to this in their own conditions and it’s impossible to give a diagnosis.

We are all customers of big tech. Should we, for example, boycott Amazon? Callum Cant: I think that organising at work is more powerful but organising as consumers also has a role to play. If there are clear differences and opportunities to use your consumption in a leveraged way, then by all means, especially if the workers involved are calling for that. If Amazon workers call for a boycott on, say, Black Friday, then we encourage people to listen to that. Absolutely. But there’s got to be a set of principles guiding whatever action people take in whatever locations, and the key one of those is that collective action is the major way forward.
