Google-owned YouTube announced on Wednesday that it would begin enforcing its policy against this content, citing Tuesday’s safe harbor deadline for the US election, the date after which state election results cannot effectively be challenged. YouTube said that enough states have certified their election results to determine a President-elect. National news outlets have universally projected that Joe Biden will be the next President.

As an example of content that would be banned, YouTube said it would take down videos claiming that a presidential candidate won the election due to widespread software glitches or errors in counting votes.

It will begin enforcing the policy starting Wednesday, and said it would “ramp up” efforts in the weeks to come. It will still allow news coverage and commentary videos to remain on the platform if they provide enough context. Any videos in violation of the policy that were posted prior to Wednesday will remain up even though they now break YouTube’s rules; they will feature an information panel noting that election results have been certified.

When asked why YouTube did not implement these policies ahead of or during the election, a YouTube spokesperson again cited Tuesday’s safe harbor deadline.

During the election, YouTube arguably took the least aggressive action of the major platforms on election-related misinformation. For example, a video claiming that President Trump won four more years in office and spouting baseless claims that Democrats are “tossing Republican ballots, harvesting fake ballots, and delaying the results to create confusion” was allowed to remain on the platform. At the time, YouTube said the video did not violate its rules and would not be removed.

YouTube’s election-related policies prohibited content that misleads people about where and how to vote. The company also placed an information panel at the top of search results related to the election, as well as below videos that talk about the election. The information panel linked to both Google’s election results feature and to the Cybersecurity and Infrastructure Security Agency’s “Rumor Control” page for debunking election integrity misinformation. YouTube also worked to promote content from authoritative news sources in search results.

YouTube said its election results information panels have been shown more than 4.5 billion times. Those panels were placed on over 200,000 election-related videos. The company also said that on average 88% of the videos in its top 10 search results related to elections came from authoritative news sources and more than 70% of its recommendations for election-related topics came from authoritative news sources.

During the election and in the aftermath, Twitter (TWTR) labeled and restricted how tweets containing misinformation could be shared, including several from President Trump. Facebook (FB) initially only labeled posts, but later added temporary measures to limit the reach of misleading posts and added hurdles to sharing them.

Since September, YouTube said, it has taken down more than 8,000 channels and “thousands” of harmful and misleading videos about the election for violating its rules. More than 77% of those videos were removed before they had received 100 views.

Separately, Google also plans to lift its election-related moratorium on political ads beginning Dec. 10, the company informed advertisers on Wednesday. The tech giant said in a letter to advertisers obtained by CNN that Google will lift its “sensitive event” designation for the US election and restore its standard policies surrounding election ads. The moratorium, which was announced ahead of the election, was expected to last for at least a week following Election Day but has gone on for roughly a month.

CNN Business’ Brian Fung contributed reporting
