YouTube details how it will tackle misleading election content

The platform says it will remove content that tells viewers an incorrect voting date or makes false claims about a candidate’s eligibility


Reuters February 04, 2020
People are silhouetted as they pose with mobile devices in front of a screen projected with a YouTube logo, in this picture illustration taken in Zenica October 29, 2014. PHOTO: REUTERS

On the day of the Iowa caucuses, the first nominating contest of the US presidential election, Alphabet’s YouTube detailed how it will tackle false or misleading election-related content.

The video-streaming service said in a blog post (https://youtube.googleblog.com) on Monday that it would remove any content that has been technically manipulated or doctored and may pose a “serious risk of egregious harm.”

It also said it does not allow content that aims to mislead people about voting, for instance by telling viewers an incorrect voting date, or content that makes false claims about a candidate’s eligibility to run for office.

The blog post also said YouTube would terminate channels that impersonate another person or channel, misrepresent their country of origin or conceal their links with a “government actor.”

Social media companies are under pressure to police misinformation on their platforms ahead of the November election.

In January, Facebook Inc said it would remove “deepfakes” and other manipulated videos from its platform, although it told Reuters that a doctored video of US House Speaker Nancy Pelosi which went viral last year would not meet the policy requirements to be taken down.

Major online platforms have also been scrutinized over their political ad policies. In November, Google, which is also owned by Alphabet, announced it would stop giving advertisers the ability to target election ads using data such as public voter records and general political affiliations.

It now limits audience targeting for election ads to age, gender and general location at a postal-code level. Political advertisers can also still use contextual targeting, such as serving ads to people reading about a certain topic.

Google and YouTube also have policies prohibiting certain types of misrepresentation in ads. However, when former Vice President Joe Biden’s campaign asked Google to take down a Trump campaign ad that it said contained false claims, a company spokeswoman told Reuters it did not violate the site’s policies.

While Twitter Inc has banned political ads including those that reference a political candidate, party, election or legislation, in a push to ensure transparency, Facebook has announced limited changes to its political ad policy.

Facebook, which has drawn criticism for exempting politicians’ ads from fact-checking, said it does not want to stifle political speech.
