Fair housing protest, Seattle, 1964. Photo: Seattle Municipal Archives [CC BY 2.0], via Wikimedia Commons (uploaded by Jmabel)

August 22, 2018; San Antonio Current

“Algorithmic injustice.” It’s a turn of phrase that may conjure memories of a pop quiz in math class, but today it represents a significant battle over the future of social media fairness and accountability.

With Facebook approaching 2.25 billion users and some $40 billion in advertising revenues in the US alone, concerns about its influence have drawn the engagement of social justice organizations. The issue has grown beyond better-known concerns like Russian interference in the 2016 presidential election and into the broader domain of discrimination and fairness. Previous NPQ articles have reviewed concerns about the balance between philanthropic endeavors and control of the internet, the way that social media data is being controlled, and the concerns the Congressional Black Caucus raised last November about Facebook’s “Ethnic Affinities” advertising features. It is this very type of “filtering” that is making news in the courts, with pending decisions that have the potential to forever change the connection between social media content and provider accountability.

The National Fair Housing Alliance, Fair Housing Justice Center, Housing Opportunities Project for Excellence, and Fair Housing Council of Greater San Antonio are collectively pursuing a court action against Facebook. The suit accuses the social media giant of knowingly empowering landlords and property sellers to run ads that unjustly filter out applicants:

Facebook’s advertising platform enables landlords and real estate brokers to exclude families with children, women, and other protected classes of people from receiving housing ads…after being warned repeatedly about its discriminatory advertising practices, Facebook continues to use this data to deny people access to rental housing and real estate sales ads because of their sex and family status.

ProPublica investigated in 2016 and found that Facebook’s ad platform permitted advertisers for a variety of goods and services (including housing) to exclude African Americans, Latinxs, and Asian Americans from receiving ads. According to the housing advocacy plaintiffs, “while Facebook has recently removed some of these options, it continues to violate fair housing laws that prohibit discrimination in other ways.”

Although the legal, technological, and sociological complexities of these issues cannot be overstated, the broad battle lines being drawn over rental housing are becoming clear: the fight is over whether a social media platform like Facebook is an “interactive computer service” or a “content provider.” The distinction is critically important, which is why the suit by the housing agencies, launched back in March, has now generated both an administrative complaint from the US Department of Housing and Urban Development (HUD) and a filing by the US Justice Department (DOJ) in direct support of the plaintiffs’ complaint.

The DOJ preliminary statement addresses the issue head on: “Facebook’s argument that the Communications Decency Act…immunizes it from the FHA [Fair Housing Act] rests on the faulty premise that it is merely an interactive computer service. To the contrary, the Complaint sufficiently alleges that Facebook is an internet content provider and that it may be held to account for that content.”

The HUD complaint is more blunt, and although not attached to any specific court action, clearly establishes support for the housing alliance viewpoint: “The Fair Housing Act prohibits housing discrimination including those who might limit or deny housing options with a click of a mouse,” said Anna María Farías, HUD’s Assistant Secretary for Fair Housing and Equal Opportunity. “When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it’s the same as slamming the door in someone’s face.”

Facebook is clearly feeling the pressure. Although its August 21 public memo mentions neither the lawsuit, DOJ’s participation, nor the HUD complaint, its tone and timing seem clearly designed to demonstrate that the company is listening and responding.

We’re committed to protecting people from discriminatory advertising on our platforms. That’s why we’re removing over 5,000 targeting options to help prevent misuse. While these options have been used in legitimate ways to reach people interested in a certain product or service, we think minimizing the risk of abuse is more important. This includes limiting the ability for advertisers to exclude audiences that relate to attributes such as ethnicity or religion.

The Facebook memo stops short of accepting responsibility for facilitating discriminatory practices, and it appears it will take enforcement, not suggestion, before Facebook accepts content responsibilities more akin to those of a newspaper or public broadcaster.

Current court actions against Facebook in some ways mirror battles led by nonprofit organizations against media companies in decades past. In 1989, the Open Housing Center brought a successful court action against the New York Times that ended in a settlement, with the Times announcing policy changes that spread throughout the industry.

In essence, Facebook’s defense in 2018 replicates the failed argument of the Times in 1989 that it “merely published the advertisements as submitted.” The current housing alliance suit contends that Facebook cannot dismiss the results of its filtering options as merely choices made by advertisers, nor disclaim accountability when those filters are used for purposes of discrimination.

It might seem unfair to hold social media platforms to the same expectations as the relatively simple operations of a newspaper. But consider that in 2018 the New York Times receives and publishes advertising in much the same manner as Facebook: ads come in and go out through digital technology that does not require humans. The difference is that the Times has adopted technological and human safeguards that Facebook has not.

Steph Jespersen, the Times’ director of advertising acceptability, explained to ProPublica that the company’s staff runs automated programs to catch any ads that contain discriminatory phrases such as “whites only” or “no kids.” But machines don’t run the whole show: even potentially discriminatory words or phrases like “near churches” or “close to a country club” are reviewed by humans to make sure they are not attempts at an end run around the paper’s advertising standards.
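
ProPublica’s description suggests a two-tier screen: a blocklist of explicitly discriminatory phrases that triggers automatic rejection, plus a list of coded phrases that routes an ad to a human reviewer. The Times’ actual system is not public; the Python sketch below, with illustrative phrase lists and a hypothetical screen_ad_copy function, is only meant to show the general pattern.

    # Illustrative phrase lists only; a real system would be far more extensive
    # and would not rely solely on exact phrase matching.
    AUTO_REJECT_PHRASES = ["whites only", "no kids"]
    HUMAN_REVIEW_PHRASES = ["near churches", "close to a country club"]

    def screen_ad_copy(ad_text: str) -> str:
        """Return 'reject', 'review', or 'accept' for a housing ad's text."""
        # Lowercase and collapse runs of whitespace so phrase matching is robust.
        normalized = " ".join(ad_text.lower().split())
        if any(phrase in normalized for phrase in AUTO_REJECT_PHRASES):
            return "reject"  # explicit discriminatory language: block automatically
        if any(phrase in normalized for phrase in HUMAN_REVIEW_PHRASES):
            return "review"  # coded language: route to a human reviewer
        return "accept"

    print(screen_ad_copy("Cozy 2BR, adults preferred, no  kids"))       # -> reject
    print(screen_ad_copy("Charming bungalow close to a country club"))  # -> review
    print(screen_ad_copy("Sunny 1BR near transit and parks"))           # -> accept

The point of the two tiers is exactly what Jespersen describes: unambiguous violations can be blocked by a machine, while ambiguous, possibly coded language still gets a human judgment call.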

Speaking on NPR in Austin, Glenn Grossenbacher, a San Antonio attorney familiar with nonprofit housing groups and issues, said that the HUD and DOJ actions signaled “a good day for those plaintiffs.” In addition to the DOJ weighing in against the motion to dismiss the suit, Grossenbacher said the government actions address a frequent Facebook tactic.

“Basically, they can no longer say ‘If this is such a big deal, why isn’t the government doing something about it?’” said Grossenbacher. “Because now they have.”

The efforts of these nonprofit housing organizations to take on Facebook may read like a David and Goliath story, but their social justice savvy has been backed by the technological prowess to draw effective connections between Facebook’s filters and discriminatory practices.

David Berman, an attorney directly involved in the case on behalf of the Fair Housing Council of San Antonio, told the San Antonio Current that all four plaintiffs were able to generate their own deliberately discriminatory advertisements, all of which were approved by Facebook in a matter of minutes.

“The capacity is there,” Berman said. “It would just take one bad actor with a lot of properties to affect hundreds of thousands of apartments.”

With the additional support from DOJ and HUD, Berman says the nonprofit coalition is feeling confident that “the court will find that Facebook did violate the law.”

It might be time for Facebook to have a conversation with the New York Times and come up with a similar human-involved solution, because “we’re just an interactive computer service” is no improvement on the “disengaged” arguments of 1989.

This story is an encouraging example of the relevance of the social justice community in the digital age. These organizations are showing impressive leadership and proving highly effective at asserting human rights in the social media world, fueled by the same passion for inclusion and fairness that they’ve traditionally brought and continue to bring to local communities.—Keenan Wellar