Facebook bans deepfake videos in the lead up to the 2020 U.S. election

The world’s largest social media network finally flexes its power to make U.S. elections safer

Last year, Facebook announced measures to protect the 2020 U.S. election from foreign influence and misleading information.

Now, in addition to those measures, the firm has announced that it is banning deepfakes – manipulated photos and videos – from its platforms, a move aimed at curbing misinformation ahead of the U.S. presidential election later this year.

The ban was announced in a blog post by Monika Bickert, Facebook’s Vice President of Global Policy Management, who stated that “misleading manipulated media” would be removed if it meets the following criteria:

It has been edited or synthesized – beyond adjustments for clarity or quality – in ways that aren’t apparent to an average person and would likely mislead someone into thinking that a subject of the video said words that they did not actually say. And:

It is the product of artificial intelligence or machine learning that merges, replaces or superimposes content onto a video, making it appear to be authentic.

Facebook also clarified that “this policy does not extend to content that is parody or satire, or video that has been edited solely to omit or change the order of words.”

Any manipulated content that does not meet the requirements for removal will still be eligible for review by Facebook’s independent third-party fact-checkers, who may rate it false or partly false.

In that case, its distribution in News Feed will be significantly reduced and it will not be allowed to run as an ad. People who have already shared the content will be shown “warnings alerting them that it’s false.”

Written by Onuora Amobi

