Facebook has removed more than 500 pages and 250 accounts it says have repeatedly broken rules around spam and “co-ordinated inauthentic behaviour”.
The social network said it had clamped down on large-scale spam postings designed to artificially inflate likes and shares on misleading content, luring users to fake websites built to generate advertising revenue through clicks.
Facebook said its action was taken with next month’s mid-term elections in the US in mind, amid the ongoing debate around the influence of the internet and social media on elections.
The site said spam networks were increasingly using “sensational political content” to generate attention on the platform as the elections approached.
Writing in a blog post about the removed pages and accounts, Facebook’s head of cybersecurity policy, Nathaniel Gleicher, and product manager Oscar Rodriguez said the focus of Facebook’s rule enforcement was on behaviour, rather than content.
“Topics like natural disasters or celebrity gossip have been popular ways to generate clickbait. But today, these networks increasingly use sensational political content – regardless of its political slant – to build an audience and drive traffic to their websites, earning money for every visitor to the site,” they said.
“And like the politically motivated activity we’ve seen, the ‘news’ stories or opinions these accounts and pages share are often indistinguishable from legitimate political debate.
“This is why it’s so important we look at these actors’ behaviour – such as whether they’re using fake accounts or repeatedly posting spam – rather than their content when deciding which of these accounts, pages or groups to remove.”
In total, Facebook said 559 pages and 251 accounts had been removed in the latest crackdown. Many were judged to be fake accounts, or multiple accounts with the same name, posting massive amounts of content across a network of groups and pages in order to drive traffic to their websites.
“Many used the same techniques to make their content appear more popular on Facebook than it really was. Others were ad farms using Facebook to mislead people into thinking that they were forums for legitimate political debate,” Facebook said.
The social network acknowledged that legitimate groups and organisations on the site also use co-ordinated techniques to raise awareness, but it would continue to work to find and remove those who misused the platform.
“Of course, there are legitimate reasons that accounts and pages co-ordinate with each other — it’s the bedrock of fundraising campaigns and grassroots organisations,” the company said.
“But the difference is that these groups are upfront about who they are, and what they’re up to.
“As we get better at uncovering this kind of abuse, the people behind it — whether economically or politically motivated — will change their tactics to evade detection.
“It’s why we continue to invest heavily, including in better technology, to prevent this kind of misuse. Because people will only share on Facebook if they feel safe and trust the connections they make here.”