Facebook is tightening its policy against QAnon, the baseless conspiracy theory that paints President Donald Trump as a secret warrior against a supposed child-trafficking ring run by celebrities and "deep state" government officials.
The company said Tuesday that it will remove Facebook pages, groups and Instagram accounts for “representing QAnon” — even if they don’t promote violence. Facebook did not immediately explain what it means for Facebook groups to “represent” QAnon.
Less than two months ago, Facebook said it would stop promoting the group and its adherents, although enforcement was spotty. At the time, it said it would remove QAnon groups only if they promoted violence. That is no longer the case.
The company said it is starting to enforce the policy as of Tuesday but cautioned that it “will take time and will continue in the coming days and weeks.”
The QAnon phenomenon has sprawled across a patchwork of secret Facebook groups, Twitter accounts and YouTube videos in recent years. QAnon has been linked to real-world violence, including criminal cases involving kidnapping, as well as dangerous claims that the coronavirus is a hoax.
But the conspiracy theory has also seeped into mainstream politics. Several Republicans running for Congress this year are QAnon-friendly.
By the time Facebook and other social media companies began enforcing policies — however limited — against QAnon, critics said it was largely too late.
Reddit, which began banning QAnon groups in 2018, was well ahead, and to date it has largely avoided having a notable QAnon presence on its platform.
Twitter did not immediately respond to a message for comment on Tuesday.