"Silent march to end stop and frisk and racial profiling," Long Island Wins

April 1, 2019; Next City

In an excerpt published in Next City from her book BIASED: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do, Stanford social psychologist (and MacArthur Fellow) Jennifer Eberhardt delves into the impact of implicit bias in perpetuating segregation and racial discrimination. More than half of whites, Eberhardt explains, say they would not move to an area that is more than 30 percent Black, because they "believe that the housing stock would not be well maintained and crime would be high."

More broadly, Eberhardt writes, studies by sociologists Lincoln Quillian and the late Devah Pager show that "the more Blacks there are in a community, the higher people imagine the crime rate to be—regardless of whether statistics bear that out." Eberhardt also cites the work of Robert Sampson and Stephen Raudenbush, who have found that the more Blacks there are in a neighborhood, the more disorder people see, even when measurable signs like graffiti, boarded-up houses, or garbage in the street don't differ. Eberhardt adds that "Black people are just as likely as whites to expect signs of disorder in heavily Black neighborhoods."

As NPQ has noted, technology often embeds these biases in algorithms. For example, many advocates of eliminating cash bail in California, including the California chapters of the American Civil Liberties Union and the NAACP, dropped their support of legislation to ban cash bail once the bill was amended so that the decision on whether to detain or release a person would be based on “an assessment algorithm to create an individual ‘risk score’ that supposedly reveals the likelihood of re-arrest or failure to appear in court if released.” As Sam Levin of the Guardian explains, “Because the data comes from a criminal justice system that has documented discrimination at every step—including racial biases in police stops, searches and arrests,” the algorithms would likely reinforce existing racial inequities in the state’s criminal justice system.
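
A small simulation can make that mechanism concrete. The sketch below is a minimal illustration, not the bill's actual risk tool or any real dataset: the rates are invented, and it simply gives two groups identical underlying behavior while policing one group twice as heavily, so a "risk score" built from recorded arrests rates that group as roughly twice as risky.

```python
import random

random.seed(0)

# Invented numbers for illustration only: both groups have the SAME
# underlying rate of the behavior being predicted, but Group B is
# policed twice as heavily, so the behavior is recorded more often.
TRUE_RATE = 0.10                      # identical for both groups
DETECTION = {"A": 0.25, "B": 0.50}    # over-policing doubles B's recorded arrests

def recorded_arrest_rate(group: str, n: int = 10_000) -> float:
    """Share of n simulated people who end up with a recorded arrest."""
    arrests = 0
    for _ in range(n):
        engaged = random.random() < TRUE_RATE
        if engaged and random.random() < DETECTION[group]:
            arrests += 1
    return arrests / n

# A naive risk score trained on these records inherits the skew:
for group in ("A", "B"):
    print(f"Group {group}: recorded arrest rate = {recorded_arrest_rate(group):.3f} "
          f"(true behavior rate = {TRUE_RATE:.2f})")
```

Nothing in the toy model mentions race, yet the score for the over-policed group comes out twice as high for identical behavior, which is exactly the dynamic Levin describes.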

That said, as NPQ’s Jeanne Allen has pointed out, algorithms can be used to mitigate biases if there is “an intentional focus when developing, buying, or adapting data systems.”
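
One concrete form that "intentional focus" can take is auditing whether a model's error rates differ across groups before deploying it. The sketch below uses fabricated toy records, not any real system's data, and the fairness check shown (comparing false positive rates by group) is one common approach, not a method prescribed by Allen's article.

```python
from collections import defaultdict

# Fabricated toy records: (group, predicted_high_risk, actually_reoffended)
records = [
    ("A", True, False), ("A", False, False), ("A", True, True), ("A", False, False),
    ("B", True, False), ("B", True, False), ("B", True, True), ("B", False, False),
]

false_pos = defaultdict(int)  # flagged high risk but did not reoffend
negatives = defaultdict(int)  # everyone who did not reoffend

for group, predicted, actual in records:
    if not actual:
        negatives[group] += 1
        if predicted:
            false_pos[group] += 1

for group in sorted(negatives):
    rate = false_pos[group] / negatives[group]
    print(f"Group {group}: false positive rate = {rate:.2f}")

# A large gap between groups is a red flag that the model is
# reproducing bias in its training data rather than measuring risk.
```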

In her excerpt, Eberhardt profiles one such example from Nextdoor, a social network that aims to help people “feel comfortable connecting with neighbors they’ve never met.” Often the network is used for conventional needs, such as finding a lost dog, getting babysitter recommendations, and so forth. But the firm found that the software was also being used to “warn” neighbors about “a stranger who seems out of sync with the prevailing demographic.”

The firm therefore changed its posting process to discourage this kind of use of the platform. As Eberhardt explains, “The posting process was changed to require users to home in on behavior, pushing them past the ‘If you see something, say something’ mindset and forcing them to think more critically: if you see something suspicious, say something specific.”

To curb racial profiling, the firm, Eberhardt explains, developed a checklist of reminders that users must click through before they can post under the banner of "suspicious person" (a code sketch of such a gate follows the list). Specifically, the checklist prompts users to:

  • Focus on behavior. What was the person doing that concerned you, and how does it relate to a possible crime?
  • Give a full description, including clothing, to distinguish between similar people. Consider unintended consequences if the description is so vague that an innocent person could be targeted.
  • Don’t assume criminality based on someone’s race or ethnicity. Racial profiling is expressly prohibited.
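
As a rough illustration of how a posting flow can enforce prompts like these, here is a hypothetical sketch; the class, function, and field names are invented for this example and are not Nextdoor's actual code. A "suspicious person" post is rejected until every reminder is acknowledged and the user supplies specifics about behavior and appearance.

```python
from dataclasses import dataclass, field

# Hypothetical reminders mirroring the checklist above.
REMINDERS = ("focus_on_behavior", "give_full_description", "no_racial_profiling")

@dataclass
class SuspiciousPersonPost:
    behavior: str                      # what the person was doing
    description: str                   # clothing etc., to avoid vague matches
    acknowledged: set = field(default_factory=set)

def can_publish(post: SuspiciousPersonPost) -> tuple[bool, str]:
    """Gate the post: every reminder clicked through, and specifics supplied."""
    missing = [r for r in REMINDERS if r not in post.acknowledged]
    if missing:
        return False, f"Acknowledge these reminders first: {missing}"
    if len(post.behavior.split()) < 5:
        return False, "Describe the specific behavior that concerned you."
    if len(post.description.split()) < 5:
        return False, "Give a fuller description so an innocent person isn't targeted."
    return True, "OK to publish."

# A vague post is held back until the user adds specifics:
post = SuspiciousPersonPost(behavior="standing around", description="tall guy")
print(can_publish(post))

for reminder in REMINDERS:
    post.acknowledged.add(reminder)
post.behavior = "tried several car door handles on Elm Street around 11pm"
post.description = "red hooded jacket, blue jeans, white sneakers, carrying a backpack"
print(can_publish(post))
```

The point of a gate like this, as Eberhardt describes it, is friction: the extra clicks and required specifics slow the user down long enough for deliberate thinking to catch up with a snap judgment.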

The idea behind the prompts is to slow down the rash, sometimes unconscious, thinking that leads to racial profiling behavior. The prompts did not eliminate racial profiling, but the firm says such profiling fell by more than 75 percent. Eberhardt adds, "They've even adapted the process for international use, with customized filters for European countries, based on their mix of ethnic, racial, and religious tensions."

Sarah Leary, a cofounder of Nextdoor, says she is hopeful that with appropriate safeguards, the platform can serve its intended purpose of building connection.

“There’s a whole canopy of examples of people’s lives that are maybe more similar to yours than you assume,” Leary tells Eberhardt. “When you have direct connections with people who are different from you, then you develop an ability to recognize that.” Or, as Eberhardt puts it, the “scary” Black teenager in the hoodie becomes “Jake from down the block, walking home from swim team practice.”—Steve Dubb