New Delhi: YouTube has discreetly loosened its content moderation policies, prioritising "freedom of expression" over strict enforcement of its rules, according to a New York Times report published on June 9, 2025.
The policy shift, which began in December 2024 without public announcement, allows videos that violate YouTube’s Community Guidelines to remain online if they are deemed to be in the "public interest."
This change has sparked debate over the potential spread of misinformation and hate speech, particularly in the context of politically sensitive topics.
The updated guidelines, detailed in internal training materials reviewed by The New York Times, instruct moderators to leave up videos on topics such as elections, ideologies, race, gender, sexuality, abortion, immigration, and censorship, even when up to half of a video’s content violates YouTube’s rules against nudity, graphic violence, hate speech, or incendiary misinformation.
Previously, videos in which more than a quarter of the content broke the rules were subject to removal. Moderators are now encouraged to refer "borderline" cases to managers rather than remove them outright, under the directive that “freedom of expression value may outweigh harm risk.”
YouTube’s spokesperson, Nicole Bell, told The Verge that the changes aim to prevent the removal of valuable content, such as “an hour-long news podcast” containing a brief clip of violence. The policy builds on an adjustment made ahead of the 2024 U.S. election that allowed content from political candidates to stay up under the Educational, Documentary, Scientific, and Artistic (EDSA) exception, even if it violated guidelines. Bell emphasised that these exemptions apply to only a small fraction of YouTube’s content.
The shift aligns YouTube with other tech giants like Meta, which ended its fact-checking program in January 2025, and X, which moved to crowdsourced Community Notes after Elon Musk’s 2022 acquisition. Unlike its peers, YouTube did not publicly disclose the change, drawing criticism for a lack of transparency.
Despite the relaxed guidelines, YouTube reported a 22% increase in content removals in Q1 2025 compared with the same period a year earlier, indicating ongoing efforts to address problematic uploads amid rising content volume. The platform faces the challenge of balancing advertiser safety with user expression, as controversial content often drives engagement but risks alienating brands.
The timing of the shift, shortly before President Trump’s second term began in January 2025, has fuelled speculation about external influences, though YouTube insists the changes reflect evolving content trends and community feedback.