
November 15, 2016; Wall Street Journal

Many people are playing the blame game in light of Donald Trump’s electoral victory, and while it seems no one has been spared, Facebook is taking a particularly hard hit. Critics argue the social media platform may have played an unintended role in the election by controlling political discourse with newsfeed filters, essentially creating a “filter bubble” in which people only see posts from like-minded friends. Further, and more importantly, critics say that fake news stories posted on Facebook swayed votes away from Democratic candidate Hillary Clinton.

The Wall Street Journal says, “The algorithms underlying the news feed reward posts that drive a lot of engagement, in the form of shares, likes and comments. But that same formula encourages fake news, hoaxes and misinformation, according to critics, which include former and current Facebook employees, who are openly disturbed by the election of Mr. Trump.”

The overall argument seems to be that Facebook’s algorithms showed people what they wanted to see and perpetuated fake news sites, and that this swung votes toward Trump. Facebook founder and CEO Mark Zuckerberg has vehemently denied these claims, calling the idea “pretty crazy.” Further, he says, “Why would you think fake news would be on one side, but not on the other?” According to the Wall Street Journal, he argued that such claims reflect a lack of understanding of why people voted for Trump.

But perhaps this isn’t such a far-fetched claim. It is common knowledge that you can’t trust everything you read on the Internet, but what if a friend posts it? That can lend credibility to a fake news article, particularly if you consider that friend politically savvy. Snopes.com, a website created to sniff out rumors spread on the web, recently posted a list of the top 25 viral news stories that turned out to be untrue. Of these, 19 were related to presidential candidates Trump and Clinton.

Social media in general played an unprecedented role in this election, with candidates even communicating directly with voters through Facebook and Twitter. Facebook, however, was created to be social in nature, hence the term “social media.” Although the platform was not intended as a source of news, the reality is that for many people, it was where they got their election information. In fact, a Pew Research Center study indicates that the majority of adults in the U.S., 62 percent, get at least some of their news from social media.

While Zuckerberg may deny that Facebook had a role in this election, a group of his employees have apparently formed an unofficial committee to look into the claims. One employee said:

There is a lot more we could be doing using tools already built and in use across Facebook to stop other offensive or harmful content. We do stop a lot of people from posting nudity or violence, from automatically flagging certain sites to warning people who post content that doesn’t meet the community guidelines. If someone posts a fake news article, which claims that the Clintons are employing illegal immigrants, and that incites people to violence against illegal immigrants, isn’t that dangerous, doesn’t that also violate our community standards?

Flagging sources of fake news is a start. But, at the end of the day, users of social media need to be careful about what they post or repost, in part because they can spread fake news, but also because their reputations are on the line. Users whose posts are questionable or who pick fights with other users may lose credibility among friends and eventually see their networks shrink, as we have seen during this election with the so-called “purge” phenomenon. Social media stands as an extension of the individual; in some ways, you are what you post. The responsibility lies not only with Facebook but also with its users to ensure social media remains a positive asset in our lives.—Sheela Nimishakavi