Wednesday, 17 May 2017

Facebook’s Fact Checking Can Make Fake News Spread Even Faster

After acknowledging that it has a problem with fake news, Facebook recently introduced a feature that flags certain posts as "disputed." In some cases, however, the flag appears to be having the opposite of the effect Facebook intended.
According to a report by The Guardian, the tagging of fake news is not consistent, and some stories that have been flagged continue to circulate without a warning. In other cases, traffic to fake news posts actually increased after Facebook applied the warning.
Facebook started rolling out the new feature last month as part of a partnership with a group of external fact-checking organizations, including Snopes.com, ABC News, and PolitiFact.
When a user tries to share a link that has been marked as questionable, an alert pops up saying the story in question has been disputed. The alert links to more information about the fact-checking feature and notes that "sometimes people share fake news without knowing it." If the user shares the link or story anyway, it is supposed to appear in the news-feeds of other users with a large note that says "disputed" and lists the organizations that flagged it as fake or questionable.
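To make that warn-then-label sequence concrete, here is a minimal sketch in Python. The registry of flagged links (DISPUTED_LINKS), the share_link function, and the Post structure are all hypothetical names invented for illustration; this is only a model of the flow described above, not Facebook's actual implementation.

    # Illustrative sketch only: hypothetical names and data, not Facebook's code.
    from dataclasses import dataclass, field
    from typing import List, Optional

    # Hypothetical registry of links flagged by third-party fact checkers.
    DISPUTED_LINKS = {
        "http://example.com/hoax-story": ["Snopes.com", "PolitiFact"],
    }

    @dataclass
    class Post:
        url: str
        disputed_by: List[str] = field(default_factory=list)

    def share_link(url: str, user_confirms: bool) -> Optional[Post]:
        """Simulate sharing a link that may carry a 'disputed' flag."""
        checkers = DISPUTED_LINKS.get(url, [])
        if checkers:
            # Step 1: warn the user before the post is published.
            print(f"Disputed by {', '.join(checkers)}. "
                  "Sometimes people share fake news without knowing it.")
            if not user_confirms:
                return None  # the user backs out of sharing
        # Step 2: if shared anyway, the post carries the 'disputed' note
        # and the names of the organizations that flagged it.
        return Post(url=url, disputed_by=checkers)

    # Example: the warning is shown, the user shares anyway, and the post
    # appears in feeds with the 'disputed' label attached.
    post = share_link("http://example.com/hoax-story", user_confirms=True)
    if post and post.disputed_by:
        print("Shown in feeds as disputed by: " + ", ".join(post.disputed_by))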
The idea behind the effort was to try to decrease the visibility of hoaxes and fake news, which many Facebook critics believe are spread rapidly by the site's news-feed algorithm.
