Given that 44% of Americans get their news from social media sites like Facebook Inc. (NASDAQ:FB) and Twitter Inc. (NYSE:TWTR) or from Alphabet Inc.’s (NASDAQ:GOOG) Google, it stands to reason that if much of what is posted is not true, many of those people could have voted based on false information.
Mark Zuckerberg has pushed back against the notion that Facebook’s news feed contains a lot of false or hoax stories, saying in a statement, “Of all the content on Facebook, more than 99 percent of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics.”
Sources Say Facebook Is Worried
On Monday, Gizmodo reported that, despite Zuckerberg’s denial that there is a problem, Facebook has struggled internally since May over how the platform vets the news reports it runs. According to Gizmodo, the debate is over whether Facebook has a duty to prevent misinformation from being shared with its users.
Sources told Gizmodo that Facebook had created a News Feed update that would have identified fake or hoax news stories, but that it disproportionately impacted right-wing news sites. Because of this, the sources said, Facebook shelved the update.
Facebook sent an email to Gizmodo saying it “did not build and withhold any News Feed changes based on their potential impact on any one political party.”
The company has, however, made it possible since January 2015 for users to self-report fake stories on their feeds. At that time Facebook said, “The strength of our community depends on authentic communication. The feedback we’ve gotten tells us that authentic stories are the ones that resonate most. That’s why we work hard to understand what type of stories and posts people consider genuine — so we can show more of them in News Feed.”
Sorting Out The Truth
Slate noted that the small number of sources (two) for the Gizmodo story raises some questions, although the notion that Facebook undertook a review to eliminate the appearance of bias isn’t controversial by itself.
The second claim, that Facebook shelved a fix because it disproportionately affected conservative stories, was reported by only one source. That, Slate said, weakens the argument and leaves room for skepticism.
Pressure Is On
Nonetheless, Facebook, Twitter and Google are under growing pressure to ensure news content posted on their platforms is accurate. They walk a fine line between policing content and alienating users.
Karen North, director of social-media studies at USC, speaking with the Wall Street Journal, asked, “Do you really want Facebook and Twitter deciding what you can talk about?”
At the same time, if users decide content on one site is less accurate than on another site, they may migrate to the second site, costing the less truthful site millions of dollars in ad revenue.
Solutions So Far
To date, Facebook and Google have pledged to bar news sites they believe are distributing fake stories from using their ad services. Twitter has said it would allow users to block notifications of tweets that include specific words, a measure designed to combat harassment on its platform.
While Facebook hasn’t said how it would vet sites, Google does have an established program for dealing with news organizations that want to appear in Google News.