YouTube Defends Choice to Leave Up Election Misinformation Videos
YouTube is defending itself against claims that its platform is helping promote and spread misinformation surrounding the 2020 US election. The platform said its most popular election-related videos come from authoritative sources. The company also claims it takes measures to curb the spread of videos containing false or misleading claims by limiting their reach through its recommendation engine and by not surfacing them in search results.
YouTube, however, did not specify what it considers authoritative, nor did it break down what percentage of views of election content come from users typing phrases into the search box rather than from following or seeking out particular channels, or from finding videos via Facebook, Reddit, or other social networks.
Even though its top-10 search results for election content may include mainstream media sources, YouTube does not appear to acknowledge how often users seek out videos from untrustworthy sources directly or find them through other means.
In response to a tweet from Bloomberg journalist Mark Bergen, YouTube wrote from its YouTubeInsider account, “Like other companies, we’re allowing these videos because discussion of election results & the process of counting votes is allowed on YT. These videos are not being surfaced or recommended in any prominent way. The most popular videos about the election are from authoritative news organizations. On average, 88 percent of the videos in top-10 results in the U.S. come from high-auth sources when people search for election-related content.”
YouTube has come under fire in the run-up to and after Election Day for allowing videos from organizations like One America News Network that falsely say President Donald Trump won re-election and that mass voter fraud is responsible for his loss to President-elect Joe Biden.
Unlike Facebook and Twitter, which have been aggressively labeling and removing links and posts that spread false information surrounding the election, YouTube says it allows people to discuss the outcome of the election and processes like vote counting, even if they do so in ways that spread unproven conspiracy theories or peddle false or misleading claims. YouTube claims it counteracts the spread of such content by limiting how discoverable those videos are in search and in its recommendation engine.
However, YouTube appears to be struggling to contain the spread of its videos on other social networks like Facebook, where they often go viral before either company can slow them down.
YouTube says it pins an election information panel to the top of election-related search results, pointing users to a Google page with verified election results. It is also removing advertising from certain videos that undermine “confidence in elections with demonstrably false information,” according to The New York Times, though it is not removing the videos outright.
In one example of how YouTube helps amplify misinformation, Vice reported on a false claim that RealClearPolitics had rescinded its Pennsylvania call in favor of Biden; the claim was circulated by Trump lawyer Rudy Giuliani and then repeated in a video published by the right-wing YouTube channel The Next News Network.