Fake News is About to Become Fake Video, and Reality Is In Big Trouble

It’s harder than you’d expect to put your finger on exactly what the problem is with modern discourse, even if it seems obvious on the surface that something has gone terribly wrong. You can point to the proliferation of fake news, and to the fact that there no longer seems to be a single set of agreed-upon facts from which we can all draw our own conclusions, but there have always been liars, and there have always been people who listen to them.

So, perhaps the one saving grace is that, until now, there have still been things that are inarguably true, even if some still try to argue with them — photos, video, or the work of a well-known, trusted journalist. Well, public trust in the media is in tatters, and social media and Photoshop make for a potent combination — pictures can be altered and taken out of context to fit anyone’s aims. Video might have been the last bastion of truth, and now we get to ask what happens when even that can’t be accepted at face value.

Video editing software, audio tools, and graphics processors have become powerful enough that anyone sufficiently committed can manufacture reality. Starting from pre-existing footage and a range of editing tools, it is now possible to create videos purporting to show people saying and doing things they never actually did.

Naturally, as with all things internet, this started with porn. Motherboard recently reported on the shockingly rapid rise of deepfakes: pornographic videos with the faces of various celebrities edited in. In mere months, not only had the edited videos become convincing to all but the carefully trained eye, but apps and programs had made the process easy enough for almost anyone to produce such a video.

Image from the FakeApp website, a tool built to make face swapping in videos easy.

Since then, social media sites like Reddit and Twitter have banned deepfakes, but we all know how the internet works: those videos will still be made and shared somewhere, including on both of those platforms. Sure, they can be taken down once detected, but any damage to someone’s reputation will already have been done.

