The following is an excerpt from Work Without the Worker: Labour in the Age of Platform Capitalism (2021) by Phil Jones, reprinted with permission from Verso Books.


A drone hovers hornet-like above São Paulo’s colossal favela Paraisópolis. It traverses the slum’s territory, listlessly gliding above the shanties, perhaps to transmit images to the Military Police Operations Centre, a violent state apparatus that mercilessly represses the slum’s inhabitants. The drone then drifts over the home of someone who has just logged onto the site Scale, which sources workers across the Middle East and Latin America to label images used to guide automated drone systems.1 The worker remains unaware of what is taking place above them, just as they remain unaware of the purpose of their work. Whether the tasks power autonomous weapons systems raining disaster down on slum districts or else power geographic data for humanitarian agencies that provide aid to such disaster zones is knowledge not available to the workers. Nothing about the tasks in and of themselves reveals their purpose. Workers remain reliant on the good faith of requesters to offer such information – an extravagant kindness one would imagine is rare.

If microwork represents a shift in the contours of informal sector work, it also announces a new, dismal instalment in the treatment of those marginal to the wage. In ways beyond Marx’s most vivid nightmares, the poor and dispossessed now unwittingly train the very machines built to track their movements and terrorise their communities, or else replace their role in the labour process. Indeed, it should be emphasised that these nascent methods of platform capital may not represent a divergent economic path so much as omens of a coming world in which the primary or secondary role of most work is to feed machine learning systems. Microwork may, then, represent a crisis of work in its fullest etymological sense: that is, a turning point. Thus, the phenomena described in this chapter are perhaps not merely features of microwork, but early experiments in how to organise a whole range of subemployed pursuits amid capitalist decay.

 

Black Box Labour

If modern economy triumphed under the aegis of a new rationalist mythos – the worker who freely enters the wage as a rational agent – then microwork either reveals the vacuity of a much-treasured fable or else denotes our arrival in a new world. The answer is perhaps both. The boosterish claims of bootstrap doctrinaires and wage fabulists have, of course, always inflated if not entirely distorted the degree of knowledge available to a given economic actor. Even so, microwork does suggest something of platform capitalism’s arrival at a new kind of subject, no longer enlightened by knowledge, but plunged into the darkness of data and the opaque worlds it creates. In certain ways, microwork seems a perfect exemplar of what James Bridle terms our ‘new dark age’, a refraction of the enlightenment, where tools supposed to illuminate our world throw us into new kinds of techno-induced ignorance and, eventually, barbarism.2

But the new ignorance has old class roots. Inequities around who sees and who remains blind have undoubtedly been aggravated by recent innovations in ‘Big Data’, so many of which are perhaps more bluster than reality, with companies like the data analytics consultancy Acxiom confidently promising clients a panoptic ‘360-degree customer view’. But capital has long claimed the status of prophet, often by simply impeding the vision of workers. The difference now perhaps is that, as algorithms make ever more decisions automatically, ever more of reality takes place behind our backs. For algorithmic sorcery to remain the rarefied precinct of data mystics and arbitrageurs, new kinds of economic blindness must be conjured.

Myopia undoubtedly affects those at the furthest reaches of other supply chains too – someone stitching garments in Bangladesh for Primark may not know, ultimately, which company their labour serves. More generally, the factory worker or shop assistant remains unaware, on some level at least, of their exploitation – hence the oft-quoted line from Marx’s Capital, ‘they do not know it but they are doing it’.3 But workers do know they are producing a tyre for a car or selling a garment for someone to wear. Even someone working for a company that manufactures nuts and bolts for distant military contractors is able, with some research, to figure out the nature of their work. Microwork, however, thins the aperture of knowledge to a tiny sliver of light, divesting workers of the capacity to know what they are doing and to what end. The Bangladeshi tailor knows they are making a shirt for someone to wear, even if they do not know which company will eventually sell it. The shirt has a tangible use the tailor can readily perceive. The worker on Clickworker, on the other hand, often has little idea of what they are creating. One might say that, in every instant the tailor can see, the microworker is blind.

This is in no small part because the tasks exist at such a high degree of abstraction it becomes impossible to relate them to anything like a meaningful whole. More importantly, though, microwork sites are ‘like clandestine installations on unmapped territory; too little is known about them’.4 Unlike the nuts and bolts made by a worker for Ford, the coffee served for Starbucks or the survey handled by a call centre worker, the products of microtasks are often hidden away from workers for reasons of secrecy. When transcribing audio of voices, the worker knows they are putting into writing the words of a speaker with an Irish accent. But there is no sense of what this recording actually is (e.g., data for a chatbot algorithm) or how it will be used (e.g., to automate fast-food restaurants). Such information is concealed by big tech cabals who rely on microwork sites to facilitate projects of a secret nature.

Google’s use of microwork for a US Department of Defense initiative, Project Maven, is a case in point.5 In one of many secret deals between the US military and big tech, the Pentagon contracted Google to develop an artificial intelligence program capable of sorting thousands of hours of drone video, ultimately with the goal of helping the military identify targets on the battlefield. For the program to be useful, it would need to learn how to differentiate objects into ‘buildings’, ‘humans’ and ‘vehicles’. Partly to keep costs low, but also to keep the project private, Google contracted the services of Figure Eight (now Appen), a microwork site that specialises in data annotation. Via the Figure Eight platform, taskers then provided algorithms with the requisite data sets by identifying objects in CAPTCHA-like images taken from the footage. In so doing, workers unwittingly helped Pentagon officials to engage in ‘near-real time analysis’ – to ‘click on a building and see everything associated with it’.6 The anonymity here afforded Google, alongside the highly abstract nature of the videos, meant workers could not see who they were working for and what they were working on – a drone video does not immediately reveal itself as a tool of war, likely appearing as innocuous footage of an urban area.7

A team of sociologists found that workers annotating data for autonomous vehicles similarly had little idea about what they were working on:

Some respondents mentioned a task they called ‘motocross’ where they had to identify roads and tracks in photographs and to indicate the nature of the ground (pebbles, road, sand, etc.). Some thought it was for a video game, others for a census of racetracks. This is because, as we soon realized, requesters vary widely in the extent to which they provide detailed information on their tasks, and on the purposes they serve, leaving workers often confused.8

This is a particular problem when – as with Project Maven – the technologies supported by microwork are built for explicitly oppressive ends. A particularly grim example: requesters are not obliged to state that face-tagging tasks – common across all platforms – are used to train facial recognition algorithms. Modelled on eugenicist theory, the software is used to capture people’s faces and compare the photos to existing databases, with the aim of identifying and locating people – often producing highly racist results.9 Only the latest strategy in the militarization of urban space, facial recognition has unleashed a police armageddon on poor neighbourhoods, most notably in the vast carceral cities of Los Angeles and Shanghai. The LAPD has used the software around 30,000 times since 2009, often to defend richer enclaves from ‘gang crime’.10 In the wake of Covid-19, use of the software was ramped up across the globe, but most evidently in many Chinese cities. Ostensibly used to help track the virus, the technology’s more obvious purpose has been to track and detain minorities. Most disturbingly, the technology has been central to an ethnonationalist cleansing project that has seen the Chinese state intern growing numbers of the Uyghur population in concentration camps. The state-owned ‘commerce’ platform Alibaba now offers clients software that has the express purpose of identifying Uyghur faces.11

The tasks that power these authoritarian nightmares are central to the service platforms like Mechanical Turk offer to requesters.12 More pertinently, Amazon likely uses the service internally to train its own controversial software, Rekognition, described by the company – in terms as vague as they are sinister – as a tool for monitoring ‘people of interest’.13 That the software has been contracted to many police departments and pitched to a number of security agencies, including US Immigration and Customs Enforcement (ICE), only further suggests its racialized targets.14 Recent decisions by IBM, Amazon and Microsoft to stop contracting these technologies to police departments seem more a considered calculation of the PR risks in light of growing support for Black Lives Matter than a genuine ethical commitment, suggesting that if or when support wanes such deals will be back on the table.

Other companies such as the menacingly named Clearview AI continue without shame or mercy to contract the software to agencies like ICE.15 The short data tasks that ultimately benefit these agencies are entirely divorced from the oppression they conjure, lacking descriptions that directly link them to the technology or any indication of which firms are contracting them. Unable to see who or what the tasks empower, workers blindly develop technologies that facilitate urban warfare and cultural genocide. It is a grim irony that the refugees who use microwork sites are effectively forced to create the very technology that directly oppresses them, a further, though by no means new, twist in the capitalist tale of machines subjugating workers to racist structures.

Part of the problem is the sheer number of sites and interfaces among which workers are shuttled on a daily basis, making it close to impossible to identify the kind of work one is involved in. The platform for which work is actually being completed hides behind complex multilayered structures, whereby different roles are taken by different sites. The worker might believe they are completing tasks on YSense, when in fact the platform is only acting as an agent for Appen, itself hosting tasks for Google.16 As corrupt avatars for big tech, microwork sites hide the new satanic mills of firms that ‘do no evil’.17

Vendor management systems (VMSs) add a further layer of opacity to already murky chains of outsourcing.18 These systems recruit and supply workers for sites like Microsoft UHRS and Google Raterhub, acting as agents that in some cases pose as microwork platforms in their own right. Muddying the waters further, some companies like Clickworker act as microwork sites and VMSs simultaneously, hosting tasks from a range of smaller requesters as well as supplying labour to bigger clients like Microsoft UHRS. VMSs are often used by larger platforms in tandem with nondisclosure agreements (NDAs) to keep their use of microwork quiet. For instance, Google used a VMS to hide workers on EWOQ, the company’s highly enigmatic precursor to Raterhub.19 Such diligent attempts to vanish its raters are more fundamentally efforts to conceal the secrets that power its predictive PageRank algorithm, utilising the same methods of NDAs and VMSs that Facebook uses to contract moderators and fortify its own algorithmic edifice.

As the world’s poor are corralled into helping a platform plutocracy predict the future, the present necessarily becomes a less predictable terrain. Effectively working inside a black box, workers are divested of all the usual ways to orient themselves inside the labour process.20 There are no managers, only algorithms; no fellow workers, only avatars of competitors; no obvious points of contact or information. Work is a realm of ‘unknown unknowns’, of shadows playing across the wall and ‘black swans’ appearing out of the dark, where all that remains visible is the task directly in front of the worker. Big tech companies lurk in the shadows, tasks are obscure, while accounts are closed and requesters vanish without warning. Blind and isolated, one struggles to see what one’s labour precisely is and who it benefits, just as one struggles to defend oneself against an employer about which nothing is known.

The worker, then, plays nightwatchman to a shadowy algorithm. They may know that training data is fed into the algorithm and that a decision comes out of the other side, but what goes on in between remains entirely opaque.21 This opaque space represents a black box, a dark patch covering something of significant social effectivity, entirely impenetrable – for reasons often of power and secrecy – to those outside its workings. Hidden is how the algorithm makes the decision – on what grounds, for whom and with what aim. As appendages to these algorithms – refining, enhancing and supervising their capacities – workers spend their days in this shadowy netherworld, neither able to see the process on which they labour nor readily seen by those outside its parameters. This is how larger platforms want their labour: obscure to those doing it and invisible to the wider world.

 

Workers without a Workforce

The aim, however, is to conceal not just a wider labour process, but workers from each other. Platform interfaces provide no messaging services or profiles that workers can access. This is partly to foreclose potential militancy but more fundamentally to prevent a workforce, in any conventional sense, from coming into existence at all. Thousands of workers in contact with each other would heighten the risk of a secret project being made public. But it would also threaten to dispel the algorithmic illusion and thus disrupt the financial interests these sites uphold. The threat is nowhere more palpable than to firms using microwork to disguise their workers as machines in a bid to attract venture capital. As Lilly Irani points out:

By hiding the labour and rendering it manageable through computing code, human computation platforms have generated an industry of start-ups claiming to be the future of data. Hiding the labour is key to how these start-ups are valued by investors, and thus key to the speculative but real winnings of entrepreneurs. Microwork companies attract more generous investment terms when investors perceive them as technology companies rather than labour companies.22

For the whole nexus of corporate reputation, financial circuitry and technological spectacle that sustains platforms to remain intact, workers must remain out of sight. Whether for the purposes of sourcing venture capital or concealing a clandestine project, microwork vanishes big tech’s dirty little secrets. In place of workers one finds a cheerful pageantry of machines, an exhibition of innovation and fanciful valuations. All that can be seen from the outside is the apparent successes of entrepreneurs and programmers, not the mundane exploitation of day-to-day capitalism. To achieve this, workers must be kept apart, not only by oceans and borders but by software interfaces that segregate the workforce, making it not only invisible to requesters but to itself.

In the wake of the Covid-19 pandemic, the tactics by which this is achieved are increasingly valorised under the rubric of remote work. Work outside of the workplace, taking place either in living rooms or coffee shops, perfectly harmonises with the model of labour Silicon Valley has been incubating over the last decade, one where workers never meet or communicate. This forms a single aspect of the cloistered digital world envisaged by the likes of Amazon and Facebook, where all interactions, whether civil, political or economic, take place on platforms accessed from the comfort of our own homes. The postpandemic world is set to be one where the argument for more contact as opposed to less will be an increasingly difficult one to make.23 Realising this world in the labour market, microwork represents the apex of neoliberal fantasy: a capitalism without unions, worker culture and institutions – indeed, one without a worker capable of troubling capital at all. As if bringing to life capitalism’s fever dreams, microwork undermines not only the wage contract, distinct occupations and worker knowledge, but the workforce as a unified, antagonistic mass.

 

Data Nightmares

That the world’s jobless and marginalized are being corralled into powering the drones that hover over their homes and the cameras that identify and deport them is perhaps only as depressing as it is unsurprising. There is, however, another more sinister experiment taking place on informal workers in the recesses of silicon capital, of which Mechanical Turk is perhaps exemplary. At first glance, it’s not entirely obvious what Amazon gets out of the platform. The site hardly represents a significant venture, at least not in the sense of obvious profits. The sum total that the platform takes from transactions per annum represents, by any calculation, a mere drop in the vast ocean of Amazon’s annual revenue. Factor in the costs of running the site and its profitability seems somewhat dubious.

But reading through the small print of the site’s terms and conditions for workers, it soon becomes obvious what Amazon’s real agenda is: ‘The Task content that you upload and work product that you receive via the Site may be retained and used to improve the Site and other machine learning related products and services offered by us’.24 Given more than a second’s glance, one realises these words suggest something rather novel: each task completed on the platform automatically sends Amazon a precise data set about how it was completed. Mechanical Turk may appear as a labour broker, an intermediary that takes a cut for hosting exchanges between workers and employers, but its real purpose is to provide data for Amazon Web Services.25

Just as Mechanical Turk allows Amazon to broaden the scale and scope of its data capacities, many smaller microwork sites have data-trading agreements that benefit larger platforms. In its online terms and conditions, Playment states that ‘the work-product the User collected and/or generated whether by answering questions, taking photos etc. becomes Playment’s property.’26 Because the product described here is labelled or categorised data – a nonrivalrous resource – both requester and Playment can enjoy its use simultaneously. Like Mechanical Turk, Playment receives the data content of a task simply by acting as an intermediary. But unlike Mechanical Turk, which operates solely for Amazon’s interest, Playment shares this data with third parties, one of which is Facebook.27 The social media site is used by Playment to build up profiles of its workers’ friends and predict which of these contacts might also want to work on the site. In the process, we should expect that Facebook receives a wealth of annotated data on a variety of tasks.

While the privacy terms and conditions for other sites are not available unless one signs up for work, one can speculate with some certainty that Raterhub grants Google access to Appen’s vast trove of labour data and that Microsoft uses UHRS to access data on Clickworker. We should think here in terms of data promiscuity, whereby the range of its uses stretches far beyond a company like Microsoft’s immediate requirements. A microwork site’s ability to attract a client as big as Microsoft or Facebook relies on a perceived ability to grow its own and, in the process, the client’s access to ever richer and more varied sources of data. Data has a centripetal force in networks, forever moving toward the larger platforms at the centre. To put it another way: because networks are only ever hierarchies in disguise, the more microwork sites a company like Microsoft has in its orbit the greater the range of data it can capture.

It is not entirely coincidental, then, that the financial mechanisms that sustain microwork sites encourage data practices that ultimately benefit larger platforms. To remain solvent, a site like Playment is compelled to collect data to attract capital and grow financial valuations, at least to the extent that venture capitalists regard data-rich platforms as more competitive, efficient and innovative than those without such capacities.28 In other words, Playment’s financial viability rests less on the labour service it provides and more on the data it collects – data which ultimately funnels up to Facebook and Google.

From the platform’s perspective – and indeed that of many larger requesters – the utility of the worker may be far closer to that of a Facebook or Google user than to that of a wage labourer. The product of the task itself is often less useful than the data about how it was created. One might argue that this simply extends more traditional management strategies of collecting data to optimise organisation and workflow.29 Because workers on Mechanical Turk contribute data about the work process itself – the ways workers behave, how they complete tasks, when and how often they log in, and how quickly tasks are completed – the data can be fed back into the platform or even into algorithms used in, say, Amazon warehouses, which require a range of behavioural data to effectively monitor and control worker performance.

The surveillance potential finds its bleakest realisation in Chinese data factories, which operate an alternative model of data labelling to remote microwork. Aside from a few large urban plants that, like melancholic testaments to a less automated past, are housed in old concrete and tech factories, much of the industry is emerging in small towns and rural areas, providing work to jobless blue-collar workers, who would otherwise migrate to cities and likely join the country’s seemingly boundless informal population.30 Already numbering over six thousand countrywide and taking over the employment of entire villages, these airless ‘data farms’ are likely, as the century progresses, to stretch into digital company towns or even data haciendas, where whole rural expanses are converted into tech-owned estates to which a dispossessed migrant class is effectively shackled.31 Unlike their remote counterparts on Appen and Lionsbridge, workers are penned up in offices that in their bland claustrophobia bear more than a passing resemblance to call centres. The close proximity of workers makes it easier for companies to collect physiological data than it is from those working remotely. As the worker completes a particular task, say, labelling medical images, the company records their gaze and bodily movements such as keyboard strokes, the time they take to complete the task, and how accurately they do so. Management becomes an unbearable kind of scrutiny, a continuous surveillance of the worker’s bodily responses in high definition. By mapping the labour process so intricately, managers can steer specific tasks to high-performing workers in real time.32 By the same token, given their precarious status, low-performing workers can be readily discarded by hawkish algorithms.

The ‘algocratic’ oppression of today’s digital Taylorism differs only in extent from the management styles of twentieth-century economy.33 The real difference lies outside of management, in the use of data to enhance machine learning services. Hovering over every exchange between requester and worker, Mechanical Turk can, for instance, funnel data from a short translation task into Amazon Translate, a neural machine translation service provided by Amazon Web Services (AWS). Amazon gets all of this data simply by acting as host. Here we find the primary function of Mechanical Turk: a barely profitable, potentially even unprofitable, labour platform cross-subsidising Amazon’s wider business operations as a logistics and software company.34 Mechanical Turk is interested less in the levy on transactions than in the data about the work process.

The worker as engine for machine learning is not such an absurd proposition when considering the idiosyncrasies of a company like Amazon’s wider business model. In many respects, much of what Amazon does differs little from the model of Victorian capitalism. Precarious labourers are still marshalled into warehouses and compelled to endure long hours to package goods and churn out surplus capital. But Amazon is less the ‘everything store’, more a universal logistics system. As Malcolm Harris drolly notes, ‘more than a profit-seeking corporation, Amazon is behaving like a planned economy.’35 The vast warehouses, the delivery vans, the Amazon stores are all physical expressions of a computerized logistical system which distributes labour, goods and information. Every aspect of Amazon’s business model is geared toward enhancing its computational power. Amazon Prime, for instance, loses money on each order, and only exists to attract customers onto the platform who leave the data required to power its logistics and cloud services. As Kim Moody notes:

Information technology links all aspects of logistics from the movement of goods over roads, rail, air, sea, to the various distribution and fulfilment facilities and their internal functioning. Huge data warehouses or centres are a key part of this physical supply chain infrastructure and central to the effort to speed up and smooth out the movement of goods and money.36

In the process of becoming a logistical giant, the company developed AWS, initially an internal service for data storage, software applications and computational power that has since come to provide the majority of Amazon’s operational income.37 Now the global leader in cloud computing, AWS provides governments with data storage space, the military with algorithmic power and other companies with logistics solutions and machine learning. Ever more businesses and governments now rely on Amazon to organise and store their data, which takes an infrastructure just as large as the one for physical things, composed of huge data centres expanding in number and size every year.38

Like Google’s knowledge monopoly and Facebook’s ‘social industry’, Amazon’s logistics behemoth displays an increasingly totalitarian style of economy.39 The growing number of partnerships between the tech giants, as well as their contracts with various government agencies, stage a ghoulish capitalist politburo that aims to deliver a kind of data-determined social harmony, as enchanting as it is ruinous.40 One can reasonably conclude that in this imagined future, the principal means of expropriation is no longer the wage relation but data capture, where the platform class no longer relies on labour so much as social activity, drawn from the habits and movements of day-to-day life. Looking at the range of services the platform giants hope to one day automate – warehouses, delivery, human resources, health and finance, to name but a few – we see a future in embryo, one in which the wage is effectively abolished, in which huge conglomerates stretching into cosmic totalities continue to own and control the means of production but no longer employ people, whose primary role is simply to feed machines data via their daily activities. This imagined future haunts the world of microwork, where data about a task is often more important than the task itself. Work as a productive activity becomes secondary, but it does not disappear. Rather, in becoming increasingly marginal to the interests of a system no longer creating jobs, it permeates the entire social landscape, as workers desperate for income are forced to turn every waking hour into monetizable activity. A global ‘servant economy’ beckons, writes Jason E. Smith, which ‘would push commercialism into the deep pores of everyday life, and make resisting it a crime. You would have to treat people kissing each other for free the way they treated poachers in the nineteenth century.’41

Unwittingly or unwillingly, microworkers are corralled into doing the tasks that promise such a world. No mere speculations, such visions are like refracted images of our own stagnant economy of miserable service work, presided over by an increasingly authoritarian state-market nexus, which in recent years has found a faithful lieutenant in the AI industry. The latter of these qualities puts us in mind of the Chinese state, wherein a repressive repertoire of ubiquitous facial recognition, biometrics and personal device tracking has been enlisted toward a growing social credit system that rewards conformity and punishes subversion.

Silicon Valley, of course, has its own authoritarian impulses. A far-right contingent – computer scientist Curtis Yarvin, blood-drinking cofounder of PayPal Peter Thiel, and proto-fascist politico Steve Bannon – has coalesced around the ideas of the neoreactionary prophet Nick Land. Land’s theoretical work, broadly in the accelerationist tradition, triumphantly forecasts a ‘runaway process’ in which, under the auspices of AI, capital entirely pulls away from human life.42 In this nightmarish scenario, the whole vista of antagonisms between capital and labour would effectively vanish in the total shadow of capital’s dominance. Such ideas furnish a neoreactionary milieu that sees democracy as anathema to the smooth functioning of an automated society and that proposes to replace democratic states with CEO monarchs. As eccentric as they sound, these ideas are not peripheral to the Silicon elite but comprise, in the words of Dyer-Witheford et al., ‘part of the cultural ambience of AI.’43

Even if we resist the jubilant fatalism of Land’s protofascist nightmare, the less extreme scenario in which automation continues to work its way – however slowly – through a stagnant economy will still lead to significant human misery. It is this miserable arcadia that the supposedly more respectable sections of the Silicon elite have in mind. Perversely tasked with building this future are the workers of Appen, Playment and Mechanical Turk. Processing the data and powering the algorithms that make autonomous vehicles and smart cities possible, their implicit role is to erase their own work and that of others. The data they process powers the chatbots replacing fast-food workers, the delivery bots displacing couriers and the lights-out manufacturing set to supplant factory workers. The algorithms they oversee remove the need for supervisors and managers. Google and Facebook have been clear that the ultimate role of content moderators is to automate their own jobs away.44 In doing so, microworkers fulfil the tragic function of expediting labour’s superfluity. Already, ‘any question of the absorption of this surplus humanity has been put to rest’, Benanav and Clegg grimly note. ‘It exists now only to be managed: segregated into prisons, marginalized in ghettos and camps… and annihilated by war.’45 Now these refugees, prisoners and victims of occupation are forced into microwork by law or circumstance, to undertake the grave work of furthering the superfluity of others. The refugee in Kenya’s Dadaab camp, the inmate of a Finnish prison, the jobless Rust Belt worker, all represent the surplus humanity compelled to make more of humanity surplus.

 

Copyright

Phil Jones. “Grave Work.” Work Without the Worker: Labour in the Age of Platform Capitalism. pp. 63–80. © 2021 Verso Books. Reprinted with permission.

 

 

Notes: Chapter 4, Grave Work

  1. For information about Scale’s automated drone services and the countries where the platform operates, see scale.com/drones.
  2. James Bridle, New Dark Age: Technology and the End of the Future, Verso, 2019.
  3. In the edition of Capital Volume 1 used throughout this book, the passage is translated, ‘They do this without being aware of it’. See Karl Marx, Capital Volume 1, Penguin Classics, 1990, pp. 166–7. For the translation I have used above, as well as the surrounding context, see Karl Marx, Value: Studies by Karl Marx, trans. Albert Dragtedt, New Park Publications, 1976, pp. 7–40.
  4. Trebor Scholz, Uberworked and Underpaid: How Workers Are Disrupting the Digital Economy, Polity, 2016, p. 19.
  5. Lee Fang, ‘Google Hired Gig Economy Workers to Improve Artificial Intelligence In Controversial Drone Targeting Project’, The Intercept, 4 February 2019.
  6. Ibid.
  7. Makena Kelly, ‘Google Hired Microworkers to Train Its Controversial Project Maven AI’, The Verge, 4 February 2019.
  8. Paola Tubaro, Antonio A. Casilli, and Marion Coville, ‘The Trainer, the Verifier, the Imitator: Three Ways in Which Human Platform Workers Support Artificial Intelligence’, Big Data and Society, January 2020, p. 6.
  9. See Christian Sandvig, Kevin Hamilton, Karrie Karahalios, and Cedric Langbort, ‘When the Algorithm Itself Is a Racist: Diagnosing Ethical Harm in the Basic Components of Software’, International Journal of Communication 10, 2016.
  10. Kevin Rector and Richard Winton, ‘Despite Past Denials, LAPD Has Used Facial Recognition Software 30,000 Times in Last Decade, Records Show’, Los Angeles Times, 21 September 2020.
  11. Helen Davidson, ‘Alibaba Offered Clients Facial Recognition to Identify Uighur People, Report Reveals’, The Guardian, 17 December 2020.
  12. Alex Nguyen, ‘Six Weird Crowdsourcing Tasks from Amazon Mechanical Turk’, Lionsbridge, 21 January 2019.
  13. Karen Hao, ‘The Two-Year Fight to Stop Amazon from Selling Face Recognition to the Police’, MIT Technology Review, 12 June 2020.
  14. Ibid.
  15. Kim Lyons, ‘ICE Just Signed a Contract with Facial Recognition Company Clearview AI’, The Verge, 14 August 2020.
  16. Paola Tubaro and Antonio Casilli, ‘Micro-Work, Artificial Intelligence and the Automotive Industry’, Journal of Industrial and Business Economics 46, 2019.
  17. ‘Don’t be evil’, sometimes expressed as ‘Do no evil’, is the adage that once formed Google’s code of employee conduct.
  18. Mary L. Gray and Siddharth Suri, Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass, Houghton Mifflin Harcourt USA, 2019, p. 16.
  19. Ibid.
  20. Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Information and Money, Harvard University Press, 2016.
  21. Pasquale, The Black Box, pp. 3–4.
  22. Lilly Irani, ‘Difference and Dependence Among Digital Workers’, South Atlantic Quarterly, 2015, 114 (1), pp. 225–34, p. 231.
  23. Naomi Klein, ‘How Big Tech Plans to Profit from the Pandemic’, The Guardian, 13 May 2020.
  24. See Amazon Mechanical Turk’s ‘participation agreement’ at mturk.com/participation-agreement.
  25. For a broader definition of ‘labour broker’, see Guy Standing, The Corruption of Capitalism: Why Rentiers Thrive and Work Does Not Pay, Biteback Publishing, 2017, p. 209.
  26. See Playment’s privacy policy at playment.gitbook.io/legal/privacy-policy.
  27. Ibid.
  28. Niels Van Doorn and Adam Badger, ‘Platform Capitalism’s Hidden Abode: Producing Data Assets in the Gig Economy’, Antipode 52(5), 2020, p. 1477.
  29. For this argument, see Moritz Altenried, ‘The Platform as Factory: Crowdwork and the Hidden Labour behind Artificial Intelligence’, Capital and Class 44(2), 2020.
  30. Huizhong Wu, ‘China Is Achieving AI Dominance by Relying on Young Blue-Collar Workers’, Vice, 21 December 2018.
  31. Ibid.
  32. ‘China’s Success at AI Has Relied on Good Data’, Technology Quarterly, The Economist, 2 January 2020.
  33. A. Aneesh, ‘Global Labour: Algocratic Modes of Organisation’, Sociological Theory 27(4), 2009.
  34. On the ways cross-subsidisation is used throughout platform capitalism as a tool for data extraction, see Nick Srnicek, Platform Capitalism, Polity, 2016, pp. 61–2.
  35. Malcolm Harris, ‘The Singular Pursuit of Comrade Bezos’, Medium, 15 February 2018.
  36. Kim Moody, ‘Amazon: Context, Structure and Vulnerability’, in Jake Alimahomed and Ellen Reese, eds, The Cost of Free Shipping: Amazon in the Global Economy, Pluto, 2020.
  37. Srnicek, Platform Capitalism, p. 62.
  38. See Amazon Web Services, ‘Global Infrastructure’, at aws.amazon.com/about-aws/global-infrastructure.
  39. Richard Seymour, The Twittering Machine, Verso, 2020, p. 23.
  40. See Russell Brandom, ‘Google, Facebook, Microsoft and Twitter Partner for Ambitious New Data Project’, The Verge, 20 June 2018. See also Alex Hern, ‘“Partnership on AI” Formed by Google, Facebook, Amazon, IBM and Microsoft’, The Guardian, 28 September 2016.
  41. Jason E. Smith, ‘Nowhere to Go: Automation, Then and Now Part 2’, Brooklyn Rail, April 2017.
  42. Nick Land, ‘A Quick and Dirty Introduction to Accelerationism’, Jacobite, 25 May 2017.
  43. Nick Dyer-Witheford, Atle Mikkola Kjøsen, and James Steinhoff, Inhuman Power: Artificial Intelligence and the Future of Capitalism, Pluto, 2019, p. 157.
  44. Davey Alba, ‘The Hidden Laborers Training AI to Keep Hateful Ads off YouTube Videos’, Wired, 21 April 2017.
  45. ‘Misery and Debt’, Endnotes, April 2010.