Facebook bans QAnon conspiracy theory accounts across all platforms

Facebook has banned all accounts linked to the QAnon conspiracy theory movement from its platforms.

“Starting today, we will remove Facebook Pages, Groups and Instagram accounts” representing QAnon, the company said on Tuesday.

The move is a significant escalation of Facebook’s earlier decision to remove or restrict groups and accounts sharing and promoting QAnon material.

QAnon is a sprawling conspiracy theory whose adherents claim President Trump is waging a secret war against elite Satan-worshipping paedophiles.

In a statement released on Tuesday, Facebook said its staff had begun removing content and deleting groups and pages, but that “this work will take time and will continue in the coming days and weeks”.

“Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports,” the statement added.

Facebook said it was updating measures implemented in August, which aimed to “disrupt the ability of QAnon” to organise through – and operate on – its networks.

That policy – introduced to limit the risks to public safety posed by QAnon, “offline anarchist groups” and US-based militia organisations – resulted in restrictions on more than 1,950 Facebook groups and over 10,000 Instagram accounts.

This is a big move from Facebook, which has laid out how it plans to proactively remove all evolving QAnon content from its platforms.

It comes after I asked Facebook’s vice-president of global affairs, Nick Clegg, why the site still allows QAnon to spread political disinformation to US voters and beyond using hashtags like #SaveOurChildren.

Facebook’s first crackdown on this dangerous conspiracy theory focused on violent content posted by its supporters, and removed a number of groups and pages.

But QAnon supporters soon adapted, using new, more palatable hashtags to reach parenting groups, local forums and the average Instagram feed. And the movement kept growing.

This latest move will be welcomed, but it will also be very hard to enforce, especially now that QAnon has grown so large and spread under so many new guises.

I recently spoke to US voters about how QAnon disinformation about candidates and child trafficking rings could already have impacted their friends and neighbours ahead of polling day.

They explained how people they know now believe totally unfounded claims they’ve seen on Instagram and Facebook about the Democrats running a child-trafficking ring or presidential candidate Joe Biden abusing children.

Could this move – like the last – also be too late?

Facebook is not the only social media giant to look at tackling the QAnon conspiracy movement.

In July, Twitter banned thousands of accounts and said it would stop recommending content linked to QAnon in an attempt to help prevent “offline harm”. It also said it would block URLs associated with the group from being shared on the platform.

What is QAnon?

In October 2017, an anonymous user put a series of posts on the message board 4chan. The user signed off as “Q” and claimed to have a level of US security approval known as “Q clearance”.

These messages became known as “Q drops” or “breadcrumbs”, often written in cryptic language peppered with slogans, pledges and pro-Trump themes.

The volume of QAnon-related traffic on mainstream social networking sites like Facebook, Twitter, Reddit and YouTube has exploded since 2017, and there are indications that numbers have grown further during the coronavirus pandemic.

Judging by social media, there are hundreds of thousands of people who believe in at least some of the bizarre theories offered up by QAnon.

QAnon followed on from the 2016 “pizzagate” saga, a false conspiracy theory claiming Democratic Party politicians were running a paedophile ring out of a Washington pizza restaurant.
