“Even before the global pandemic drastically increased reliance on communications technology for working, learning, shopping, and socializing at a distance, Americans from all walks of life reported a growing unease about the impact of technology companies.” So begin Yeshimabeit Milner and Amy Traub in a report published by Data for Black Lives (D4BL) and Demos this spring titled Data Capitalism + Algorithmic Racism.
Milner and Traub cast a critical eye toward what they call “data capitalism.” Data capitalism, they explain, commodifies data and uses “big data and algorithms [emphasis in original] as tools to concentrate and consolidate power in ways that dramatically increase inequality along lines of race, class, gender, and disability.”
The rise of “big data” may be new, but the struggle over control of data and technology is nearly as old as capitalism itself. Take, for example, the Luddites, the English workers who famously smashed machines in protests in the early 1810s. Today, the term “Luddite” is taken to mean someone who opposes technological progress. But as Richard Conniff observed in Smithsonian Magazine, what was actually being contested at the time had little to do with efforts to stop technological change and everything to do with ensuring that technological change benefited workers.
“The original Luddites were neither opposed to technology nor inept at using it. Many were highly skilled machine operators in the textile industry,” writes Conniff. Their goals, notes Kevin Binfield of Murray State University, who has written extensively on the Luddites, were to enable workers to preserve production quality and to receive just compensation. As Binfield explained to Conniff, “They just wanted machines that made high-quality goods, and they wanted these machines to be run by workers who had gone through an apprenticeship and got paid decent wages.” They sought, as Scottish writer Thomas Carlyle would later put it, “a fair day’s wages for a fair day’s work.”
The Luddites, clearly, did not prevail; wages for factory workers remained low for decades while a few capitalist factory owners profited enormously. Ultimately, however, British workers did succeed in extracting benefit from technological change as they organized into trade unions later in the nineteenth century.
As Milner and Traub emphasize, elite control of technology and information has not only been used to reinforce capitalist control over factories and technology. It has also very explicitly been used to bolster white supremacy. For example, they observe that the complex information tracking systems developed under slavery “were designed to extinguish networks that would have allowed enslaved people to rise up together against their captors and the economic system that entrapped them.” Examples of data manipulation being used, post-slavery, to reinforce structural racism are widespread as well, ranging from insurance to redlining in real estate, to name just two areas. The cases, note Milner and Traub, are “countless.”
Data Capitalism Today
The struggle to control data and technology may be centuries long, but the rise of information and communications technology raises the stakes. Data capitalism, Milner and Traub explain, “deploys surveillance, data extraction, monetization of data, and automated decision-making to consolidate power in the hands of corporations and the wealthy, exacerbating racial and economic inequality.”
And it is omnipresent. Milner and Traub point to a survey of 239 large corporations conducted between 2015 and 2018, which found that half of them engaged in widespread surveillance, with the trendline suggesting that tracking of company employees is becoming increasingly common.
Nowhere is this more obvious than in the so-called gig economy, where platform companies like Uber and DoorDash employ algorithms that “manage and discipline workers as closely as any supervisor or boss.” The mathematical formulas of such algorithms can freeze into place assumptions rooted in structural racism and economic inequality. As Milner and Traub emphasize, commonly found in the lines of code “are legacies of racist public policy and discrimination dating back to the foundation of this country.”
What is the alternative? Milner and Traub call for “building mechanisms for collective consent and democratic control over data and algorithmic decision-making.” To do so, they focus on four specific policy areas: data transparency, data use regulation, structural changes to markets, and changes to overall data governance.
Dismantling Data Capitalism: Some Initial Action Steps
Milner and Traub caution that what they offer in their report is a survey of policy options and “is not intended to be comprehensive or to constitute a single, unifying set of policy recommendations.” Nonetheless, the policy options laid out provide a useful road map to current areas of economic and political struggle.
- Data transparency: Policies highlighted in the report include those that would enable individuals to access “data reports” (much like credit reports) and require employers to disclose to employees any surveillance they conduct. Similar proposals for algorithms would require employers to inform workers what the algorithms measure in assessing their performance and require the code of algorithms used by governments to be fully public, so that their terms may be fully subjected to public debate.
- Regulatory reforms: Provisions highlighted here include measures that would establish individual rights to access, correct, delete, and move personal information; prohibit the use of personal data to discriminate; and establish the right of criminal defendants and their lawyers to access the software used to collect data against them so they can more effectively challenge erroneous charges. There are also measures that would restrict or ban the use of certain types of data, including limits on employer collection of worker data and on the use of surveillance data by police.
- Structural changes: These include a host of antitrust measures (including breaking up large tech companies and banning “self-preferencing” in search engines), as NPQ has covered. Another potential structural change would be to ban, limit, or tax targeted advertising, the primary revenue source of large tech firms such as Google and Facebook. The authors also advocate changing the law to reclassify as employees the gig workers who currently labor for platform companies (such as Uber drivers or Instacart deliverers) and are presently treated as “independent contractors” responsible for paying for their own benefits.
- Governance: The most far-reaching changes surfaced in the report fall in this category. Among the proposals highlighted are public ownership of broadband networks (an example of which exists in Detroit, Michigan), support for worker-owned platform co-ops, and data trusts to oversee the use of data collected through computer networks. According to Milner and Traub, “A data trust is a structure where data is placed under the control of a board of trustees with a fiduciary responsibility to look after the interests of the beneficiaries—you, me, society. Using them offers all of us the chance of a greater say in how our data is collected, accessed, and used by others.”
Toward a Positive Vision of Democratic Data Management
To date, as cellphones, social networks, and online shopping have become ubiquitous, the power of companies with access to platform data has increased exponentially. Yet Milner and Traub insist that a more democratic system of data management is both possible and necessary. D4BL, they write, “is based on a very simple idea: that any technology is rendered invalid without the trust, consent, and collaboration of the community and the people directly impacted.” They add that “more equitable outcomes are possible when activists, advocates, and policymakers push for and win greater transparency, regulation of harms, key structural changes to industry, and genuine shifts in the governance of data.”
Beyond achieving policy changes, what might that look like? The authors cite one Detroit resident, quoted in a 2018 report and identified only as “Ollie Mae,” who offered the following vision:
“The changes I would make would be to have data that is intentional and targeted and centering people in the middle of those decisions. So, data would be created for the people and with people as opposed to on people and against people. Data would be done in a way that is an instrument and a tool to support their uplift and the uplift of their consciousness and the quality of their lives. It would be used to map and visualize so that people’s understanding was centered in coming together and being their own solutionaries.”