Most of us are well aware of what deepfakes are and the dangers they pose, and we’ve already been warned that they will play more than a small, and sinister, role in the upcoming 2020 elections.
So, what problems could we expect?
Let’s say that, when election season is in full swing, a deepfake video is released of a candidate spewing out racial slurs. It wouldn’t take long for a video like that to go viral, far more quickly than it could ever be proven fake – by then, the damage is done.
But deepfakery has gone to the next level.
We already have face-swap deepfakes, where faces are altered or replaced; these already destroy lives, but there’s another level – full-body deepfakes!
Yep, it was just a matter of time before it happened!
What possible hellraising could this create?
Imagine this – rather than so-called breaking news about Trump’s much-fabled “Pee Tape”, undoubtedly a fake, we could actually have a deepfaker recreate the act as a full-body deepfake and leak it (not literally, we hope!) into the world!
Full-body deepfakes arrived in 2018, when researchers at the University of California, Berkeley released a paper called Everybody Dance Now, showing how deep learning algorithms can put the moves of a professional dancer on the body of a complete amateur.
The paper says:
“Given a source video of a person dancing, we can transfer that performance to a novel (amateur) target after only a few minutes of the target subject performing standard moves. We approach this problem as video-to-video translation using pose as an intermediate representation. To transfer the motion, we extract poses from the source subject and apply the learned pose-to-appearance mapping to generate the target subject.”
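The pipeline the quote describes can be sketched in a few lines. To be clear, everything below is a placeholder illustration of the data flow, not the paper’s actual code: the real system uses a pose detector (such as OpenPose) to extract keypoints and a trained image-to-image GAN as the pose-to-appearance mapping, while this sketch just fabricates keypoints and renders them as dots.

```python
# Minimal, illustrative sketch of pose-based motion transfer.
# All function names and internals are placeholder assumptions,
# NOT the "Everybody Dance Now" codebase.
import numpy as np

NUM_KEYPOINTS = 18  # common body-pose skeletons use roughly 18 joints


def extract_pose(frame: np.ndarray) -> np.ndarray:
    """Stand-in for a pose detector: reduces a video frame to 2-D joint
    coordinates. Here we fabricate a fixed-shape keypoint array."""
    h, w = frame.shape[:2]
    rng = np.random.default_rng(0)
    return rng.uniform([0, 0], [w, h], size=(NUM_KEYPOINTS, 2))


def pose_to_appearance(pose: np.ndarray, height: int, width: int) -> np.ndarray:
    """Stand-in for the learned pose-to-appearance mapping: renders the
    target subject in the given pose. A real system would use a GAN
    generator; here we mark the keypoints on a blank frame to show the
    data flow."""
    frame = np.zeros((height, width, 3), dtype=np.uint8)
    for x, y in pose.astype(int):
        frame[np.clip(y, 0, height - 1), np.clip(x, 0, width - 1)] = 255
    return frame


def transfer_motion(source_frames, height=256, width=256):
    """Motion transfer with pose as the intermediate representation:
    extract the pose from each source frame, then re-render it as the
    target subject."""
    return [pose_to_appearance(extract_pose(f), height, width)
            for f in source_frames]


# Three dummy source frames stand in for the source dancer's video.
source_video = [np.zeros((480, 640, 3), dtype=np.uint8) for _ in range(3)]
output_video = transfer_motion(source_video)
print(len(output_video), output_video[0].shape)
```

The key design idea, as the quote says, is that pose acts as a bottleneck between the two videos: only the skeleton crosses over, so the output keeps the target subject’s appearance while copying the source’s motion.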
The tech is still primitive, but it’s just a little bit cool – take a look at the video below and see for yourself:
Really and truthfully, all deepfakes are doing is democratizing what Hollywood has been doing for years using CGI tech. And, to be fair, the tech is pretty amazing but, as far as politics go, it could be a whole different ballgame, especially when the tech advances to the stage where we really can’t tell the difference between a deepfake and the real thing.
We should be thankful that no-one is ever going to use this kind of tech for malicious purposes.