YouTube to ban content targeting individuals using conspiracy theories

A man in the crowd holds a QAnon sign with the group's abbreviation of their rallying cry "Where we go one, we go all" as crowds gather to attend U.S. President Donald Trump's campaign rally at the Las Vegas Convention Center in Las Vegas, Nevada, US, February 21, 2020. REUTERS/Patrick Fallon/Files

(Reuters) – YouTube, the video service of Google parent Alphabet Inc, said it was banning content that targets an individual or a group using conspiracy theories such as QAnon or pizzagate that have “been used to justify real-world violence.”

The company said in a blog post it would begin enforcing these expanded hate and harassment policies immediately and would “ramp up” in the weeks to come.

YouTube’s move follows recent crackdowns announced by other major social media companies, including Facebook Inc and Twitter Inc, against QAnon content. Facebook earlier this month said it would remove any Facebook pages, groups and Instagram accounts “representing” QAnon.

QAnon is an unfounded and sprawling conspiracy theory that claims US President Donald Trump is secretly fighting a cabal of child-sex predators that includes prominent Democrats, Hollywood elites and “deep state” allies. The FBI has identified it as a potential domestic terrorism threat.

The conspiracy theory also borrows some elements from the bogus pizzagate theory about a pedophile ring run out of a Washington, D.C., restaurant.

A YouTube spokesman told Reuters that the recent ban affected content targeting either individuals or protected groups, such as religious or ethnic groups.

YouTube said it has removed tens of thousands of QAnon-related videos and terminated hundreds of QAnon-related channels since updating its hate speech policy in June 2019.

The company’s chief executive officer, Susan Wojcicki, told CNN in an interview this week that many QAnon videos were “borderline content” that does not violate specific YouTube policies.

The YouTube spokesman said that since January 2019, the platform has been reducing its recommendations of borderline content or videos that could misinform users in harmful ways.

(Reporting by Elizabeth Culliford in Birmingham, England, and Akanksha Rana in Bengaluru; Editing by Rosalba O’Brien and Matthew Lewis)
