What is QAnon and how are social media sites handling it?
Social media companies Facebook and Twitter have announced crackdowns on content linked to QAnon, a sprawling and unfounded conspiracy theory.
WHAT IS QANON?
QAnon followers espouse an intertwined series of beliefs, based on anonymous web postings from “Q,” who claims to have insider knowledge of the Trump administration.
A core tenet of the conspiracy theory is that US President Donald Trump is secretly fighting a cabal of child-sex predators that includes prominent Democrats, Hollywood elites, and “deep state” allies.
QAnon, which borrows some elements from the bogus “pizzagate” theory about a pedophile ring run out of a Washington restaurant, has become a “big tent” conspiracy theory encompassing misinformation about topics ranging from alien landings to vaccine safety.
Followers of QAnon say a so-called Great Awakening is coming to bring salvation.
HOW HAS IT SPREAD ONLINE?
The ‘Q’ posts, which started in 2017 on the message board 4chan, are now posted on 8kun, a rebranded version of the shuttered web board 8chan. QAnon has been amplified on Twitter, Facebook, Instagram and YouTube, the video streaming service of Alphabet’s Google.
Media investigations have shown that social media recommendation algorithms can steer people who show an interest in conspiracy theories toward more conspiracy-related material.
A report by the Institute for Strategic Dialogue (ISD) found that the number of users engaging in discussion of QAnon on Twitter and Facebook has surged this year, with membership of QAnon groups on Facebook growing 120 percent in March.
Researchers say that Russian government-supported organizations are playing a small but growing role in amplifying the conspiracy theories.
QAnon backers helped to organize real-life protests against child trafficking in August and were involved in a pro-police demonstration in Portland, Oregon.
QAnon also looks poised to gain a toehold in the US House of Representatives, with at least one Republican candidate who espouses its beliefs on track to win in the November elections.
WHAT ARE SOCIAL PLATFORMS DOING ABOUT IT?
Twitter in July said it would stop recommending QAnon content and accounts in a crackdown it expected would affect about 150,000 accounts. It also said it would block QAnon URLs and permanently suspend QAnon accounts coordinating abuse or violating its rules.
Facebook in August removed nearly 800 QAnon groups for posts celebrating violence, showing intent to use weapons or attracting followers with patterns of violent behavior. It has also imposed restrictions on the remaining 1,950 public and private QAnon groups that it found. Facebook said it plans to ban ads that promote or reference QAnon, and it does not allow QAnon pages to run commerce shops.
A spokeswoman for the short-form video app TikTok said QAnon content “frequently contains disinformation and hate speech” and that it has blocked dozens of QAnon hashtags.
A Reddit spokeswoman told Reuters the site has been removing QAnon communities that repeatedly violated its rules since 2018, when it took down forums such as r/greatawakening.
A YouTube spokeswoman said it has removed tens of thousands of Q-related videos and terminated hundreds of Q-related channels for violating its rules since updating its hate speech policy in June 2019.
YouTube also said it reduces its recommendations of certain QAnon videos that “could misinform users in harmful ways.” It does not have a specific ban on monetizing QAnon content. ISD researchers found that about 20 percent of all QAnon-related Facebook posts contained YouTube links.