YouTube announced on Friday that it will no longer remove videos containing false claims of fraud in the 2020 presidential election. The decision, made ahead of the 2024 elections, reverses a policy put in place after the 2020 vote and takes effect on June 2.
The Google-owned platform has faced significant pressure since the 2016 elections to guard against political misinformation. YouTube said the new policy reflects today's changed landscape and the need to re-evaluate its approach.
“In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm,” the company said in a statement.
YouTube has deleted tens of thousands of videos questioning the integrity of past elections but did not provide specific details about what led to the policy change. The BBC has contacted the company for comment.
The platform will continue to enforce other election misinformation policies, such as targeting videos with misleading instructions on where or how to vote.
The election fraud policy, enacted in December 2020, led to the deletion of a video posted by Donald Trump on January 6, 2021, in which he told protesters to leave the US Capitol while also repeating false claims of widespread fraud.
A video posted by a US congressional committee investigating the Capitol riot was deleted in 2022 because it contained a clip of Trump repeating election falsehoods.
YouTube lifted restrictions on Trump’s channel, which has more than 2.7 million subscribers, in March of this year. Since then, the former president has posted approximately 20 short clips in support of his campaign.
The company stated that it will continue to refine its policies in advance of the 2024 election.