by Karen Hao
The artificial-intelligence industry runs on the invisible labor of humans working in isolated and often terrible conditions—and the model is spreading to more and more businesses.
On Wednesday, the Guardian published an article about the realities of producing Google Assistant. Behind the “magic” of its ability to interpret 26 languages is a huge team of linguists, working as subcontractors, who must tediously label the training data for it to work. They earn low wages and are routinely forced to work unpaid overtime. Their concerns over working conditions have been repeatedly dismissed.
It’s just one story among dozens that have begun to peel back the curtain on how the artificial-intelligence industry operates. Human workers don’t just label the data that makes AI work. Sometimes human workers are the artificial intelligence. Behind Facebook’s content-moderating AI are thousands of content moderators; behind Amazon Alexa is a global team of transcribers; and behind Google Duplex are sometimes very human callers mimicking the AI that mimics humans. Artificial intelligence doesn’t run on magic pixie dust. It runs on invisible laborers who train algorithms relentlessly until they’ve automated their own jobs away.
In their new book Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass, anthropologist Mary Gray and computer scientist Siddharth Suri argue that you and I could be next.
I caught up with Gray this week to discuss why people turn to ghost work, how their invisibility leaves them more vulnerable to terrible working conditions, and how we can make this new form of work more sustainable.
The following has been edited for length and clarity.
MIT Technology Review: How do you define ghost work?
Mary Gray: It’s any work that could be—at least in part—sourced, scheduled, managed, shipped, and built through an application programming interface, the internet, and maybe a sprinkle of artificial intelligence. It arguably becomes ghost work when the proposition is that there are no humans involved in that loop, that it’s just a matter of software working its magic.
So the definition really hinges on how the end product or service is marketed.
Yeah. The work, or the output, itself is not inherently bad or good. It is specifically the work conditions that make it bad or good. Providing a service like those we describe in the book, whether captioning, translation, or cleaning training data for algorithms, is work that is often written off as mundane drudgery. Think of content moderation right now and how it’s sensationalized as something horrific and terrible to do. From the perspective of the workers, it’s a job. And it’s a job that actually takes quite a bit of creativity and insight and judgment. The problem is that the work conditions don’t recognize how important the person is to that process. That diminishes their work and creates conditions that are unsustainable.
Companies have a long history of exploiting the labor of less privileged communities. You bring up the example of the fashion industry in your book. Is there something particularly distinct about ghost work that creates even more cause for concern?
In some ways, ghost work is indeed a continuation of the mistreatment of many working people. To me, the dramatic shift is that we’ve never quite had industries so completely sell contract labor as automation: not just making it difficult for consumers to see the supply chain, the way we can in textiles, food, and agriculture, but also claiming that there’s really not a person working here at all. I get chills just thinking: if that is taken to every sector that effectively sells information services, that’s a lot of people and their participation in the economy erased. That also makes it very difficult for workers to organize and to claw back power.