YouTube Details How It Will Tackle Misleading Election Content

YouTube said it would remove any technically manipulated or doctored content that may pose a "serious risk of egregious harm."

YouTube would terminate channels which impersonate another person or channel

Highlights
  • YouTube detailed how it will tackle false election-related content
  • It would remove any content that has been technically manipulated
  • It does not allow content that aims to mislead people about voting

On the day of the Iowa caucuses, the first nominating contest of the US presidential election, Alphabet's YouTube detailed how it will tackle false or misleading election-related content.

The video-streaming service said in a blog post on Monday that it would remove any content that has been technically manipulated or doctored and may pose a "serious risk of egregious harm."

It also said it does not allow content that aims to mislead people about voting, for instance telling viewers an incorrect voting date, or content that makes false claims related to a candidate's eligibility to run for office.

The blog post also said YouTube would terminate channels which impersonate another person or channel, misrepresent their country of origin or conceal their links with a "government actor."

Social media companies are under pressure to police misinformation on their platforms ahead of the November election.

In January, Facebook said it would remove "deepfakes" and other manipulated videos from its platform, although it told Reuters that a doctored video of US House Speaker Nancy Pelosi which went viral last year would not meet the policy requirements to be taken down.

Major online platforms have also been scrutinized over their political ad policies. In November, Google, which is also owned by Alphabet, announced it would stop giving advertisers the ability to target election ads using data such as public voter records and general political affiliations.

It now limits audience targeting for election ads to age, gender and general location at a postal code level. Political advertisers can still target contextually, for example by serving ads to people reading about a certain topic.

Google and YouTube also have policies prohibiting certain types of misrepresentation in ads. However, when former Vice President Joe Biden's campaign asked Google to take down a Trump campaign ad that it said contained false claims, a company spokeswoman told Reuters it did not violate the site's policies.

While Twitter has banned political ads including those that reference a political candidate, party, election or legislation, in a push to ensure transparency, Facebook has announced limited changes to its political ad policy.

Facebook, which has drawn criticism for exempting politicians' ads from fact-checking, said it does not want to stifle political speech.

© Thomson Reuters 2020
