- In a CBS News interview with Gayle King, Head of Instagram Adam Mosseri said Facebook hasn’t taken down a fake video of Mark Zuckerberg saying he’s in control of “billions of people’s stolen data” because the damage had already been done.
- Mosseri said taking it down now would be a “hollow” victory.
- Rather than making one-off decisions, the company is working to figure out a “principled way” to address deepfake videos, which use unnervingly realistic replicas of high-profile individuals and could spread misinformation incredibly easily via social media.
Instagram still hasn’t taken down a fake video of Facebook CEO Mark Zuckerberg because “the damage is done.”
CBS News on Tuesday released an interview with Instagram head Adam Mosseri in which Gayle King asked why Facebook hasn’t yet taken down a video of a computer-manipulated Mark Zuckerberg saying he’s in control of “billions of people’s stolen data.”
CBS News requested on June 12 that Facebook take down the video, citing infringement of CBS’s copyright.
But if Facebook took the video down now, Mosseri said, “we could declare victory but that’s not a victory at all, that’s totally hollow.”
Mosseri suggested it’s too late, because the video has already been up for a significant amount of time.
“If a million people see a video like that in the first 24 hours or the first 48 hours, the damage is done,” Mosseri said.
The fake video, posted to Instagram on June 7, had been viewed 2,000 times by June 11, four days later, when Business Insider first wrote about it. At the time of writing, the video has over 106,000 views and has been covered by numerous news outlets.
Fake videos with unnervingly realistic computer-manipulated replicas of high-profile individuals saying or doing controversial things — otherwise known as “deepfakes” — could spread misinformation incredibly easily via social media.
Mosseri said the deepfake problem can’t be addressed through one-off takedown decisions made whenever Facebook finds such content, since by then a video may already have spread too far.
Rather, Facebook is working to figure out a “principled way” to find potentially harmful deepfake videos before they spread.
“I think the most important thing for us to focus on is getting to the content quicker. Once we can do that, then we can have the next debate about whether or not to take it down when we find it,” Mosseri said. “Right now we try to balance safety and speech. And that balance can be tricky.”
King told Mosseri that it’s difficult to see the video knowing that it has been manipulated.
“I struggle with it too. I am a person too,” Mosseri said. “It’s not like I don’t have my own beliefs or I don’t see things that violates or that disagrees with me online or on Facebook or Instagram from time to time.”