Facebook report uncovers how fake news spread amid the US election

A new study from the social network has detailed how different types of fake news spread during the campaign


Facebook has detailed how fake news infiltrated its platform during the US presidential election campaign, and outlined how it plans to safeguard against similar attempts in the future.

The social network's latest report explains how it has had to “expand our security focus” from just blocking hacking, malware and spam attacks, to cover “more subtle and insidious forms of misuse” – such as spreading misinformation to manipulate civic discourse and deceive people.

It categorised different types of misinformation as “false news”, “disinformation” and “information operations” – respectively, incorrect stories claiming to be factual; incorrect content spread deliberately; and state-sponsored campaigns to distort public sentiment.


Facebook defined this last one as: “actions taken by organised actors (governments or non-state actors) to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome”.

It said it could not categorically identify the government behind these operations, although the US has accused Russia of interfering in its election.

There are three main ways Facebook has observed the spread of false information happening – through targeted data collection, content creation and false amplification.

To fight targeted data collection, Facebook will provide a set of customisable security and privacy features. These include notifications to specific people who have been targeted, proactive notifications to people who have not yet been targeted but are at risk, and direct communication with likely targets. The company will also work with the governmental bodies responsible for election protection to notify and educate people who may be at greater risk.

The US election

During the US election, Facebook said it responded to “several situations” involving fake news or misinformation on its platform. It explained how hackers stole private information from people's email accounts, then created fake Facebook profiles to share the data and pages to direct people to it.

“From there, organic proliferation of the messaging and data through authentic peer groups and networks was inevitable,” the report read.

It also said a set of inauthentic Facebook accounts pushed “narrative and themes that reinforced or expanded on some of the topics exposed from stolen data”. It added that the reach of known operations during the 2016 election was “statistically very small compared to overall engagement on political issues”.


In the future, Facebook will continue to study and monitor those who spread misinformation and try to educate those who are at risk about how they can keep their information safe, it said.

It outlined that: “Just as the information ecosystem in which these dynamics are playing out is a shared resource and a set of common spaces, the challenges we address here transcend the Facebook platform and represent a set of shared responsibilities.

“We have made concerted efforts to collaborate with peers both inside the technology sector and in other areas, including governments, journalists and news organisations, and together we will develop the work described here to meet new challenges and make additional advances that protect authentic communication online and support strong, informed, and civically engaged communities.”

Facebook's exhaustive report into fake news strikes a different tone to its CEO and founder's immediate reaction to allegations the social network contributed to President Donald Trump's victory.

Days after the result was announced, Mark Zuckerberg said: "Personally, I think the idea that fake news on Facebook - it's a very small amount of the content - to think it influenced the election in any way is a pretty crazy idea."
