February 20, 2018, Undark

In a thinkpiece in Undark, Nick Thieme, a research fellow at the University of California Hastings Institute for Innovation Law, notes that “It is a grim truism of modern life that everything from civil rights violations and health crises to environmental degradation and educational barriers are disproportionately suffered by the people least financially and socially equipped to deal with them.”

Thieme points out that, in similar fashion, computer algorithms often embed inequality. According to Thieme, “a bitter fight is emerging” as the implications of these algorithms become more widely felt and understood.

Thieme explains,

At issue is whether a society now indivisibly dependent on computer technology and its underlying programming can ensure that its vast benefits and inevitable burdens will be distributed equally across social and economic classes. If some version of this egalitarian principle, which I call “computational justice,” does not soon become commonplace, we run the risk of hard-coding all manner of injustice for generations to come.

Thieme points out that “In 2016, mathematician and former investment banker Cathy O’Neil explored the idea that algorithms in the economic, financial, and educational realms contribute to the structural effects that maintain divisions in wealth.” O’Neil called the resulting inequitable algorithms “weapons of math destruction.”

As Thieme observes, “Class affects how an algorithm applies to different people, but it also affects which algorithm applies to different people. For example, being wealthy means algorithms will find you vacation homes on Airbnb; being homeless means robots will move you if you sleep too close to buildings. Algorithms find work for the well-educated while taking it away from those without education.… Without oversight, the opportunities for injustice are abundant.”

The phrase “computational justice” (or “algorithmic justice”) may not roll off the tongue, but for years, neither did the phrase “environmental justice.” As Thieme observes, “It’s now well documented that low-income areas fare worse than wealthy ones when natural disasters strike, as when Hurricane Harvey hit Texas, Katrina hit New Orleans, or Maria hit Puerto Rico.”

Thieme points out that precisely because artificial intelligence centers on finding patterns, poorly applied algorithms can lock in the discriminatory patterns that already exist in society. Nowhere are the dangers more evident than in policing. In Baltimore, after three consecutive years of more than 300 annual homicides, the city police implemented algorithms that seek to identify “the where, when, and who” of crime—a high-tech version of the hot spot policing that critics say disproportionately targets minorities and lower-class communities. Whether predictive policing is effective, however, is subject to debate. Writing in Science, Mara Hvistendahl notes that John Hollywood, an analyst at the RAND Corporation in Arlington, Virginia, co-authored a report on the issue and concluded that the public safety gains from predictive policing are “incremental at best.”

Thieme concedes that, so far, no movement for democratic control over who codes algorithms, and toward what ends, has emerged. Still, there are signs of interest. For example, last December the New York City Council passed a law that creates “a task force to review New York City agencies’ use of algorithms and the policy issues they implicate. The task force will be made up of experts on transparency, fairness, and staff from non-profits that work with people most likely to be harmed by flawed algorithms. It will develop a set of recommendations addressing when and how algorithms should be made public, how to assess whether they are biased, and the impact of such bias.” Nationally, the Electronic Frontier Foundation has begun to actively advocate on these issues.

Thieme lauds these measures, but says greater engagement is required: “We need to recognize computational justice as a virtue and create standalone structures dedicated to actively defending it.”—Steve Dubb