August 30, 2016; Atlantic, Quartz, and Gizmodo
Last week, Quartz reported that Facebook had axed its “news curation” team, the group of editors responsible for the content of the platform’s Trending section, after this spring’s allegations of biased storytelling within the feature. Without the human team, the reasoning seemed to be, Facebook could offer a truer, non-editorialized look at its most-talked-about topics. Of course, two days after its human input ended, Trending Topics was driving traffic to a fake story about Fox News’ Megyn Kelly.
Facebook’s Trending section, which has been “curated” by a team of journalists since its launch in 2014, was criticized earlier this year when Gizmodo reported that the news curation team consistently suppressed conservative topics and articles. Facebook said that algorithms determined which stories appeared in Trending Topics, and that the editors compiling those stories worked under guidelines “to ensure consistency and neutrality.” The company stated that it conducted an investigation and found “no evidence” that the claims of bias were true. Facebook CEO Mark Zuckerberg met with conservative commentators and columnists to discuss its practices and assure balance in its news curation.
Despite the results of Facebook’s internal investigation, the company decided to ditch the news curation team and, with it, the obviously-written-by-humans headlines attached to Trending topics. Now, topics appear as a single word or phrase: In a past iteration of Trending, users might have seen the headline “Anthony Weiner reportedly caught in another sex scandal,” while users this morning see only the ominous link text “Anthony Weiner,” along with an estimate of the number of people talking about the infamous Carlos Danger. The new Trending feature pulls excerpts from news articles to appear as mouseover text for the links.
Thus, when users hovered over the words “Megyn Kelly” on Sunday, Facebook provided text alleging that the news anchor had been fired for supporting Hillary Clinton. The text was a headline pulled from endingthefed.com, which had run the (patently false) article. Quartz reported that while Facebook’s engineers are working to ensure that Trending topics are newsworthy (and real), some topics clearly slip through, leading to questions on the efficacy of the recent changes.
Facebook isn’t alone in the struggle to keep its trending topics fair. Twitter also serves up Trends based on algorithms tied to user interests, organized by human curators. It was accused earlier this year of suppressing an anti-Clinton hashtag ahead of the Democratic primaries, but the story failed to make waves of the same magnitude as the Facebook bias allegations. Perhaps more objectionably, Twitter actually allows marketers to purchase space in the Trends section. Facebook hasn’t yet allowed “promoted trends,” but the concept seems far from impossible.
Even without human editors and headline-writers, Trending is unlikely to be a perfectly unbiased platform. Facebook’s selection system serves stories to individual users based on demographics and user activity, so users see what Facebook thinks they want. Plus, algorithms responsible for serving up stories are, of course, man-made—bias is inherently embedded into these systems through language and design. Sociologist Zeynep Tufekci wrote about Facebook’s news algorithms in May, saying, “While these algorithms also use data, math and computation, they are a fountain of bias and slants—of a new kind,” and opining that Facebook should “drop the pretense that they are neutral.”
In short, Trending’s new, human-free approach may cut the bias of a handful of editors, but it doesn’t strip the platform of the influence of its algorithms, or of its users. Trending is, of course, supposed to serve up what Facebook’s collective mass is handing it: Can we be surprised that the mix includes occasional fake stories? This is what human discernment is good for.—Lauren Karch