You ever feel like we're constantly hurtling toward some kind of Black Mirror dark future?
Like we're hell-bent on creating the dystopian worlds portrayed in sci-fi movies like Blade Runner.
Or a Neal Stephenson novel.
I get that feeling a lot and it's not just because I'm paranoid.
It's cuz it seems like we're constantly trying to wrap our heads around technology that's just beyond our comprehension.
And I really think deepfake videos are the next thing people are going to have a really tough time understanding.
We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time.
[UNKNOWN], the world is gonna end and we are going to hasten it.
Find them, show me your leadership capabilities.
So what are deepfakes, you ask?
Well, deepfakes are basically digitally forged videos that make it look like people are doing or saying things that they aren't.
Deepfakes have been a thing for a while now, but people are starting to freak out because they're becoming really easy to make.
Recently, an artist began posting a series of digital works on Facebook and Instagram called the Spectre Project, which feature deepfakes of prominent figures like Mark Zuckerberg and Kim Kardashian.
That Spectre installation forced Facebook to re-examine its policies on fact-checking and restricting shared content on its site.
But what's gonna happen in a world where these deepfake videos start to get really good?
Honestly, no one really knows yet.
So what's so creepy about deepfakes is that the good ones mimic human facial behavior in ways our brains can't distinguish from the real thing.
That's the major concern right now, and for good reason: as we're painfully finding out, a lot of people aren't properly equipped to tell whether what they're reading is coming from a legitimate source or not.
So the notion of a world where we're not even sure if we should believe what we see is a whole new level of scary.
Now if you thought a wave of fabricated news stories on Facebook was a big deal, imagine a campaign of deepfakes spearheading a misinformation agenda.
We're just not ready.
And deepfake software is already out there.
The cat's out of the bag, so the challenge is now creating awareness and education.
It's not all bad, though. There are some genuine uses for this kind of AI.
Synthetic media can give people a voice who are physically unable to speak or restore speech digitally to those who've lost it.
Think about a deep learning algorithm that's able to translate a video into other languages on the fly.
I mean, there are legitimate reasons to pursue this.
We just need more action.
We need to make the world's leaders pay attention.
The Zuckerberg and Kardashian deepfakes are obviously meant to make a statement.
And like I said, they definitely feel like some kind of dress rehearsal for a world where this kind of video manipulation could be weaponized.
Last year, BuzzFeed published a video echoing that cause for concern, with a message that appeared to come from President Obama but was really Jordan Peele doing his spot-on impression.
This is a dangerous time. Moving forward, we need to be more vigilant with what we trust from the internet.
Now you can tell something's a little weird with that video.
But it won't always be so easy.
I've watched a lot of deepfakes in the last week and yeah, a lot of them are hot garbage.
But some of them are really good, like scary good.
Check out this one from the YouTube channel Ctrl Shift Face.
It's a clip from a Bill Hader interview on Conan where he does a bunch of Arnold Schwarzenegger impressions.
I'll say how old are you?
And she'll say four and a half.
Which is because her older sister is four and a half.
And I go, no, you're not.
You're not four and a half.
And then she grabs my face and goes, four and a half.
This is a really good example of a face-swapping deepfake.
Probably the most harmless variety, mostly because you can tell right away that something isn't right with what you're seeing, especially if the subjects are prominent figures.
But what about a video of someone you don't recognize? Because maybe that person doesn't really exist.
These two people are not real.
They don't exist.
They were created by a machine learning AI project.
From GPU maker Nvidia.
This technology has exploded since it was developed back in 2014.
It went from really rough black-and-white images of people's faces to what you see here, which is essentially a perfectly rendered phantom human being.
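Those phantom faces come from what's called a generative adversarial network, or GAN: two models trained against each other, a generator that fabricates samples and a discriminator that tries to flag them as fake. Here's a minimal toy sketch of that tug-of-war in plain Python. It learns to mimic numbers drawn from a bell curve around 4 rather than faces, and every name, model shape, and learning rate here is my own illustrative choice, nothing like Nvidia's actual system.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    x = max(-60.0, min(60.0, x))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-x))

# "Real" data: samples from a Gaussian centered at 4.
real = lambda: random.gauss(4.0, 1.0)
noise = lambda: random.gauss(0.0, 1.0)

# Generator g(z) = w*z + b, discriminator d(x) = sigmoid(a*x + c).
w, b = 1.0, 0.0   # generator starts producing samples around 0
a, c = 0.0, 0.0   # discriminator starts undecided
lr, batch = 0.02, 8

for step in range(4000):
    # --- discriminator update: push d(real) toward 1, d(fake) toward 0 ---
    ga = gc = 0.0
    for _ in range(batch):
        x, z = real(), noise()
        g = w * z + b
        dr = 1.0 - sigmoid(a * x + c)   # gradient of log d(x)
        df = -sigmoid(a * g + c)        # gradient of log(1 - d(g))
        ga += dr * x + df * g
        gc += dr + df
    a += lr * ga / batch
    c += lr * gc / batch
    # --- generator update: push d(fake) toward 1 ---
    gw = gb = 0.0
    for _ in range(batch):
        z = noise()
        g = w * z + b
        s = (1.0 - sigmoid(a * g + c)) * a  # gradient of log d(g) w.r.t. g
        gw += s * z
        gb += s
    w += lr * gw / batch
    b += lr * gb / batch

# The generator's output should now be centered near the real data's mean.
fakes = [w * noise() + b for _ in range(1000)]
print(sum(fakes) / len(fakes))
```

Same adversarial loop, just with millions of parameters and images instead of two numbers and a bell curve, and that's roughly how you get a face that was never attached to a person.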
So let your mind run a little bit with the crossover of this technology and deepfakes, and yeah, things are about to get real weird.
It genuinely feels like we're gonna reach the point where we can't discern whether a video of someone speaking is authentic, or whether the people in it even exist.
And I'm not saying this to provoke fear or anything. It's just something we need to be educated about, so that we keep our guard up and learn to recognize trusted sources.
It's easy to imagine a situation where someone could just brush off accountability and claim a video of them is nothing more than a really good deepfake.
What's wild is we're just at the tip of the iceberg when it comes to the sophistication of deepfake videos.
They're going to get a lot better and they're going to be much more difficult to discern, but there's a race to develop software and technology that can sniff out a fabricated or altered video.
In the wake of those deepfakes that went viral, some companies have announced an effort to combat fake content.
Just recently, Adobe posted that it's developed a method for detecting edits made to images using Photoshop's Face Aware Liquify feature.
But it's all a cat-and-mouse game, for sure.
And if history is any indicator, the sophistication of cutting-edge deepfakes will likely have detection software playing catch-up.
In a recent story on the subject from The Washington Post, a digital forensics expert at the University of California at Berkeley told the Post straight up: we are outgunned.
The number of people working on the video synthesis side as opposed to the detector side is a hundred to one.
Maybe the answer will come in the form of blockchain.
That's the technology that helps authenticate cryptocurrency transactions.
There are already encouraging ideas out there for using that technology to authenticate videos and flag deepfakes.
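The core of those blockchain proposals is pretty simple: at publication time, the creator anchors a cryptographic fingerprint of the video somewhere tamper-evident, and later anyone can check a copy against it. Here's a minimal sketch of the fingerprinting half using Python's built-in hashlib; the chain itself would just store the digest, and the function names here are my own, not from any real provenance system.

```python
import hashlib

def fingerprint(video_bytes: bytes) -> str:
    """Return a SHA-256 digest that uniquely identifies this exact file."""
    return hashlib.sha256(video_bytes).hexdigest()

def is_authentic(candidate: bytes, anchored: str) -> bool:
    """Check a copy against the digest anchored at publication time."""
    return fingerprint(candidate) == anchored

# At publication, the creator computes and anchors the digest
# (in these proposals, inside a blockchain transaction).
original = b"...raw video bytes stand-in..."
anchored_digest = fingerprint(original)

# An unmodified copy verifies; changing even one byte breaks it.
print(is_authentic(original, anchored_digest))         # True
print(is_authentic(original + b"x", anchored_digest))  # False
```

This only proves a video hasn't been altered since it was anchored, of course. It can't tell you whether the anchored video was genuine in the first place, which is why it's a provenance tool, not a deepfake detector.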
But ultimately, a lot of that responsibility is gonna fall on us, and right now, we kind of suck at it.
We love drama and we love being shocked.
It's what we respond to.
It's what engages us.
It's what drives the conversation.
That's unlikely to change.
But the way we look at things on the internet has to.
I know there's still so much to talk about with this topic, and there's a ton I have not touched on.
So you let me know what you think.
And in the meantime, I'm gonna delete every photo of myself off the internet.