Engagement With Fake News On Facebook Tripled Since 2016: Report

A handful of deceptive websites were found to attract most of the interactions on the platform.

Engagement on Facebook with news outlets that post verifiably false or manipulated content has tripled since the run-up to the 2016 United States presidential election, a recently published study has revealed.

The study, conducted by the German Marshall Fund of the United States (GMF), a Washington D.C.-based policy think tank, under its Digital New Deal project, shows that despite Facebook's repeated efforts to tackle misinformation, user engagement with such content shows no signs of abating.

This is yet another study among the many conducted in the past year that highlight the role of Facebook's algorithm in promoting misinformation to drive up user engagement.

Deception On The Rise

GMF conducted the study in partnership with NewsGuard, a tool that rates news websites for their reliability, and the social media intelligence firm NewsWhip, which provided data on the spread of articles from deceptive websites.

Such websites were identified and classified using NewsGuard's ratings (which rank news sites on how well they uphold nine journalistic principles) into the following two categories:

  • False Content Producers - Sites that, per NewsGuard, repeatedly publish provably false content
  • Manipulators - "Sites that NewsGuard determined failed to gather and present information responsibly— by presenting claims that are not supported by evidence, or egregiously distort or misrepresent information to make an argument, but without crossing NewsGuard's threshold of repeatedly publishing false content"

The study found that Facebook likes, comments and shares of articles from False Content Producers grew by 102% between the run-up to the 2016 US presidential election and the present, roughly doubling. Interactions with content from Manipulators, such as Fox News, the Daily Wire and Breitbart, grew by 293% over the same period.

"Disinformation is infecting our democratic discourse at rates that threaten the long-term health of our democracy. A handful of sites masquerading as news outlets are spreading even more outright false and manipulative information than in the run-up to the 2016 election."

- Karen Kornbluh, Director, German Marshall Fund Digital

Stopping The Handful

The study also revealed that a handful of highly popular deceptive websites were responsible for a majority of the engagement: the top 10 of the 1,000 websites assessed in the study received 60% of all likes and comments.

Based on the findings of the study, the group noted that "de-amplifying—or adding friction to—the content from a handful of the most dangerous sites could dramatically decrease disinformation online".

In response to the study, Andy Stone, a Facebook spokesman, told The New York Times that drawing conclusions from likes, shares and comments was "misleading" because such data does not represent what most users see on the platform.

While Facebook has taken repeated measures to combat the spread of misinformation and conspiracy theories, researchers have called into question its algorithm, which seemingly promotes fake news.

Earlier this year, the Wall Street Journal reported that Facebook employees had complained to the company's executives that the platform's algorithm spread divisive content rather than bringing people together.

In August, the Tow Center for Digital Journalism at Columbia Journalism School identified a network of over 1,200 dubious "local news websites" aimed at promoting partisan talking points and collecting user data. These sites were found to be owned by five different organisations, all of which led back to a single businessman.

In the same month, a report by the advocacy group Avaaz found that health-related misinformation peaked during the COVID-19 pandemic. The group noted that Facebook's efforts to minimise the spread of such misinformation were outpaced by its own algorithm's amplification of that content.

Note: BOOM is one of Facebook's third-party fact checkers in India.

Updated On: 2020-10-16T12:36:46+05:30