Society is doomed. And this technology is the first step on the way there. So first off, let’s get this out of the way. Someone taught a computer how to take one person’s face and stick it onto another person’s body in a video and… well, it’s shockingly convincing. This isn’t a Pritt Stick and Crayola-scissors cut-and-paste job. It’s real future technomagic called DeepFake, and it uses neural networks to pull off the swap.
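For the curious: the trick reportedly used by the early DeepFake tools is an autoencoder with one shared encoder and a separate decoder per identity. Train decoder A only on person A's face and decoder B only on person B's, then feed A's face through the shared encoder but out through B's decoder. Here's a deliberately tiny, untrained sketch of that layout — every array size and weight shape here is made up for illustration; real systems use deep convolutional networks trained on thousands of aligned face crops:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

DIM, LATENT = 64 * 64 * 3, 128  # flattened 64x64 RGB crop, small latent code

# One shared encoder...
W_enc = rng.standard_normal((DIM, LATENT)) * 0.01
# ...and one decoder per identity (each trained only on that person's faces)
W_dec_a = rng.standard_normal((LATENT, DIM)) * 0.01
W_dec_b = rng.standard_normal((LATENT, DIM)) * 0.01

def encode(face):
    # Compress a face crop to a pose/expression code
    return relu(face.reshape(-1) @ W_enc)

def decode(code, W_dec):
    # Render a face image from a latent code
    return (code @ W_dec).reshape(64, 64, 3)

face_a = rng.random((64, 64, 3))            # stand-in for an aligned crop of person A
swapped = decode(encode(face_a), W_dec_b)   # B's decoder renders A's pose/expression
print(swapped.shape)                        # (64, 64, 3)
```

Because both decoders read from the same latent space, B's decoder "paints" B's face wearing whatever pose and expression A had in the source frame — which is why the results track the original performance so convincingly.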
And it’s been used for porn.
Like, butt-loads of porn.
Just… so much porn. Daisy Ridley, Emma Watson, Natalie Portman, Alison Brie, Katy Perry, Sophie Turner and many, many more have had their faces digitally grafted onto porn stars, so it looks like they’re doing all kinds of NSFW things in motion. You can find the subreddit where this stuff is made here, and there’s even an app so you can make your own celebrity porn if you’re so inclined.
But, on a slightly different note, this technology is going to be a real problem down the road. As one user on the subreddit commented,
Out in the world right now is the technology to put one person’s face on another’s body through editing. The quality of these forgeries is incredible and almost indistinguishable from reality. Video is no longer infallible evidence; it’s now possible to make it seem like celebrities and political figures say and do whatever you want on camera, or to blackmail people with videos of things that never happened. And you guys are just whacking it.
This guy raises a really good point. Right now this technology is being used to make the Game of Thrones cast do awful things to each other (wait a minute… isn’t that the showrunners’ job?), but it could get worse. Much worse. Just look at the GIF below: the top is the real Rogue One Leia scene; the bottom was made using the DeepFake technology that anyone can use.
Anyone, anywhere, can easily have their face swapped onto another person’s body. It’s like something straight out of a Black Mirror episode, and that is terrifying. Not only will it make it easy to spread false videos that slander someone and ruin their image, and not only could it let people pass off falsified footage as an alibi or proof of innocence when they’re guilty (and vice versa), but it also means that when legitimate footage does show a crime being committed, the guilty party can simply say “fake”. Until now, that kind of plausible deniability applied only to still photos, never to video.
I wonder if the power of misinformation could end up being abused?