As disinformation and hate thrive online, YouTube quietly changed how it moderates content




Change allows more content that violates guidelines to stay on the platform if it's determined to be in the public interest

Nick Logan · CBC News

· Posted: Jun 14, 2025 4:00 AM EDT | Last Updated: 8 hours ago


YouTube quietly changed its moderation policies late last year, according to a new report from The New York Times, allowing more content that violates its own guidelines to stay on the world's most popular video platform. (Wendy Martinez/CBC)

YouTube, the world's largest video platform, appears to have changed its moderation policies to allow more content that violates its own rules to stay online.

The change happened quietly in December, according to The New York Times, which reviewed training documents for moderators indicating that a video could stay online if the offending material did not account for more than 50 per cent of the video's duration. That's double what it was prior to the new guidelines.

YouTube, which sees 20 million videos uploaded a day, says it updates its guidance regularly and that it has a "long-standing practice of applying exceptions" when it suits the public interest or when something is presented in an educational, documentary, scientific or artistic context.

"These exceptions apply to a small fraction of the videos on YouTube, but are critical for ensuring important content remains available," YouTube spokesperson Nicole Bell said in a statement to CBC News this week.

But at a time when social media platforms are awash with misinformation and conspiracy theories, there are concerns that YouTube is only opening the door for more people to spread problematic or harmful content, and to make a profit doing so.

YouTube isn't alone. Meta, which owns Facebook and Instagram, dialled back its content moderation earlier this year, and Elon Musk fired Twitter's moderators when he purchased the platform in 2022 and rebranded it as X.

"We're seeing a race to the bottom now," Imran Ahmed, CEO of the U.S.-based Center for Countering Digital Hate, told CBC News. "What we're going to see is a growth in the ecosystem around hate and disinformation."

WATCH | Experts warn Meta's moderation move will likely increase misinformation:

Meta's move away from fact checking could have dangerous consequences, experts warn

Facebook and Instagram's parent company Meta is getting rid of fact checkers on the platforms and will instead rely on users to comment on accuracy, but experts warn the move will likely increase the spread of misinformation.

Public interest vs. public harm

YouTube's goal is "to protect free expression," Bell said in her statement, explaining that easing its community guidelines "reflects the new types of content" on the platform.

For example, she said, a long-form podcast containing one short clip of violence may no longer need to be removed.

The Times reported Monday that examples presented to YouTube staff included a video in which someone used a derogatory term for transgender people during a discussion about hearings for U.S. President Donald Trump's cabinet appointees, and another that shared false information about COVID-19 vaccines but did not outright tell people not to get vaccinated.

A platform like YouTube does have to make some "genuinely very hard decisions" when moderating content, says Matt Hatfield, executive director of the Canadian digital rights group OpenMedia.

LISTEN | How Canada has come to play an outsized role in far-right misinformation:

Day 6: What's behind Canada's outsized influence in the world of far-right misinformation (9:42)

 

He believes platforms do take the issue seriously, but he says there's a balance between removing harmful or illegal content, such as child abuse material or clear incitements to violence, and allowing content to stay online, even if it's offensive to many or contains some false information.

The problem, he says, is that social media platforms also "create environments that encourage some bad behaviour" among creators, who like to walk the line of what's acceptable.

"The core model of these platforms is to keep you clicking, keep you watching, get you to try a video from someone you've never experienced before and then stick with that person."

And that's what concerns Ahmed.

He says these companies put profits over online safety and that they don't face consequences because there are no regulations forcing them to limit what can be posted on their platforms.

He believes YouTube's relaxed policies will only encourage more people to exploit them.

How well is YouTube moderating?

In a recent transparency report, YouTube said it had removed nearly 2.9 million channels containing more than 47 million videos for community guideline violations in the first quarter, after the reported policy change.

The overwhelming majority of those, 81.8 per cent, were considered spam, but other reasons included violence, hateful or abusive material and child safety.

LISTEN | Why you're being tormented by ads, algorithms and AI slop:

Front Burner: The internet sucks now, and it happened on purpose (38:12)

 

Hatfield says there is a public interest in having harmful content like that removed, but that doesn't mean all controversial or offensive content must go.

However, he says YouTube does make mistakes in content moderation, explaining that it judges individual videos in a sort of "vacuum" without considering how each piece of content fits into a broader context.

"Some content can't really be fairly interpreted in that way."

Regulations not a perfect solution

Ahmed says companies should be held accountable for the content on their platforms through government regulation.

He pointed to Canada's controversial but now-scuttled Online Harms Act, also known as Bill C-63, as an example. It proposed heavier sentences, new regulatory bodies and changes to a number of laws to tackle online abuse. The bill died when former prime minister Justin Trudeau announced his resignation and prorogued Parliament back in January. Ahmed says he hopes the new government under Prime Minister Mark Carney will enact similar legislation.

Hatfield says he liked parts of that act, but his group ultimately opposed it after it tacked on other changes to the Criminal Code and Human Rights Act that he says were unrelated to the platforms.

He says groups like OpenMedia would have liked to see a strategy addressing business models that encourage users to post and profit off of "lawful but awful" content.

"We're not going to have a hate-free internet," he said. "We can have an internet that makes it less profitable to spread certain types of hate and misinformation."

WATCH | How people can become more discerning news consumers:

How do you know if the news you consume is true?

Canada Research Chair Timothy Caulfield's new book, The Certainty Illusion: What You Don't Know and Why It Matters, explores how people can become more discerning and critical news consumers. He says the ability to understand where information comes from and being able to learn and change opinions has never been more important.

ABOUT THE AUTHOR

Nick Logan is a senior writer with CBC based in Vancouver. He is a multi-platform reporter and producer, with a particular focus on international news. You can reach out to him at [email protected].

    With files from Darren Major
