Saturday, June 3, 2023

The Big Lie, Con't

YouTube has suddenly discovered that allowing election denial videos stuffed with ads under the aegis of "free speech" is a great way to make money off of the burning of American democracy.
 
YouTube on Friday announced a major change in its approach to US election misinformation, saying it will no longer remove videos that make false claims about the 2020 election or previous presidential elections. Starting today, "we will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections," YouTube's announcement said.

This is a reversal from YouTube's announcement in December 2020 that it would ban videos falsely claiming that Donald Trump beat Joe Biden. YouTube said at the time that it "will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 US Presidential election, in line with our approach towards historical US Presidential elections. For example, we will remove videos claiming that a Presidential candidate won the election due to widespread software glitches or counting errors."

The Google subsidiary YouTube made its December 2020 announcement while Trump was spreading a baseless conspiracy theory that the election was stolen from him. Trump's false claims helped fuel the January 6, 2021, attack on the US Capitol.

YouTube today said it "carefully deliberated" before deciding to drop the policy:

We first instituted a provision of our elections misinformation policy focused on the integrity of past US Presidential elections in December 2020, once the states' safe harbor date for certification had passed. Two years, tens of thousands of video removals, and one election cycle later, we recognized it was time to reevaluate the effects of this policy in today's changed landscape. In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.

YouTube's policy against false claims still applies to certain elections in other countries, specifically the 2021 German federal election, and the Brazilian presidential elections in 2014, 2018, and 2022.

YouTube said other policies that reduce the spread of misinformation about US elections will not be changed. The platform said it will continue to "disallow content aiming to mislead voters about the time, place, means, or eligibility requirements for voting; false claims that could materially discourage voting, including those disputing the validity of voting by mail; and content that encourages others to interfere with democratic processes."

YouTube also said it will continue to promote "authoritative" content about elections. "We are ensuring that when people come to YouTube looking for news and information about elections, they see content from authoritative sources prominently in search and recommendations," YouTube said.

Like other social networks, YouTube suspended Trump's account after the US Capitol attack. YouTube allowed Trump back on in March of this year, saying, "We carefully evaluated the continued risk of real-world violence, while balancing the chance for voters to hear equally from major national candidates in the run up to an election. This channel will continue to be subject to our policies, just like any other channel on YouTube."
 
 So that's that. 

Looking forward to YouTube removing videos saying that Biden won the 2020 election once Republicans take over and decide that "authoritative" sources of news are only whatever Trump says they are.
