September 9, 2016; Washington Post
Facebook received another barrage of criticism last week over charges of censorship, adding more bumps to the road traveled by the preeminent social media platform on its ride to global dominance. As of Friday afternoon, September 9th, more than 40,000 people were talking about the "Napalm Girl Protest" on Facebook after moderators removed the historic photo over its content, and #NapalmGirl was trending on Twitter. (We won't comment on this turn of phrase, which carries its own discomfort.)
Facebook was quickly forced to reverse course and allow the image on its platform, according to the BBC:
The tech giant said it had “listened to the community” and acknowledged the “global importance” of the photo.
“Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed,” it said in a statement.
“It will take some time to adjust these systems, but the photo should be available for sharing in the coming days.”
The controversy began several weeks ago after Norwegian author Tom Egeland shared an iconic photograph of the Vietnam War—“The Terror of War,” by Nick Ut—on his Facebook page. Egeland was notified that the post violated the site’s nudity policy and was subsequently removed. The image, which features a young girl running from a napalm attack, won the Pulitzer Prize in 1973.
The story then gained global attention when a letter to Facebook CEO Mark Zuckerberg from Aftenposten newspaper editor Espen Egil Hansen went viral and Norway’s Prime Minister, Erna Solberg, spoke out against what she described as censorship.
As social media platforms gain users, companies like Facebook are struggling to balance the technical side of content moderation with human idiosyncrasies that can defy algorithms. The issue came to a head this spring, when NPQ looked at whether Facebook was manipulating the news after former contract workers accused the company of political bias. Despite the ongoing controversies, nearly half of users said they were comfortable with social media companies controlling what news appears on their sites, and Zuckerberg continues to insist that Facebook is a tech company, not a media company. So, even as accusations of censorship and bias increase, there is no indication that Facebook will change its practices and standards.
NPQ has argued that as Facebook’s organic traffic disappears, nonprofits pay the price. Egil Hansen makes a case along similar lines following Facebook’s policy reversal. His words, written in Norwegian on Aftenposten’s site, are a call to action for every citizen to serve as a watchdog over today’s Big Brothers:
When it comes to this photo specifically I would say that it was a sensible decision by Facebook. That’s what we editors have to do sometimes—realize that we made a mistake and change our minds. But the main point of my article, and the point that I have asked Mark Zuckerberg to engage in, is the debate about Facebook’s power that results from so much information going through its channels. And that still stands. He should begin to take part in this discussion, for there are no simple solutions. Facebook must recognize that it has become an information filter—and that raises problematic issues.