Dangers Of Artificial Intelligence: Nude Fakes
**This blog post was updated on April 28, 2021**
What are deep fakes, and how damaging are they to our society? If you haven’t heard of “deep fakes” yet, you probably will soon. We’ll discuss some of the most popular and troubling deep fake trends, including nude fakes, fake porn, and other disturbing uses of the technology.
So what are deep fakes? This new development in artificial intelligence uses deep learning techniques to create fake, computer-generated images and videos of real people. It goes far beyond airbrushing, photoshopping, or traditional video editing software. Because they’re built from footage of real subjects, deep fakes can convincingly simulate human movement and facial expressions. The technology is getting better every day, and it’s already at the point where it can be hard to tell the difference between an authentic video and a deep fake.
The dangers of artificial intelligence and the abuse of deep fakes are often discussed; however, the technology has some good uses, too, like creating digital voices for people who can’t speak and updating film footage when actors mess up their lines. But it also makes some incredibly worrying trends possible. Deep fakes allow people to make videos that can ruin your reputation, spread lies and misinformation, and even make you the subject of pornographic material.
Deep fakes are still pretty new, but they’re already good enough to do real damage. Parents need to understand how this technology works — and how it can be abused — so they can talk to their kids and help them make responsible decisions about how they use the internet.
The Era of Fake Porn (or "Nude Fakes")
Deep fakes can be used to create porn in several ways. In their earliest form, they were made by superimposing celebrities’ faces onto the bodies of porn actors so that the results looked like real scenes. All the creators needed to make them convincing were pictures and videos of the celebrities, which were readily available online. This quickly evolved into targeting people the creators knew in real life, including neighbors, co-workers, classmates, and even children. This means that everyone is now at risk of being shamed, harassed, humiliated, intimidated, or exploited with faked sexual content that anyone can threaten to publish.
The problem doesn’t end there: even our most modest photographs can be used to create deep fakes. In June 2019, a free app called DeepNude was released for Windows and Linux. It was essentially the reverse of the original deep fake process: users could upload people’s photos into the app, and the A.I. created a believable nude by replacing their bodies with pornography that already existed. Fortunately, the app was taken down within days of gaining widespread attention. But the technology is already out there, and deep-faked porn isn’t going anywhere.
Whether or not your child should be watching porn is a conversation worth having on its own, but deep fakes introduce a completely different set of ethical dilemmas. It’s clearly wrong to make porn using unwilling or unwitting participants. Furthermore, the law is lagging far behind the technology that powers nude fakes. Legal issues around sexting, consent, privacy, publicity, defamation, copyright, and identity theft are all tangled up together, and they won’t be sorted out any time soon.
Fake News on Steroids
“Fake news” was so rampant during the last presidential election that Collins Dictionary named it 2017’s Word of the Year. The phrase was mostly used to cast doubt on a particular news story, especially by those who didn’t like it. But that was all about second-hand reportage: anchors and articles summarized events and used quotes and videos to back up their claims. As we approach another major election, there’s a whole new realm of fake news to watch for.
For decades, video has been the benchmark of authenticity in news. With deep fakes, videos can now be manipulated to make it look like someone is saying or doing something that they aren’t. These videos can put politicians “on the record” saying terrible or idiotic things, wreaking havoc on their careers and the issues they champion. They can also spread misinformation, contributing to a chaotic political climate and making it even harder to agree on a single reality.
Even adults can struggle with media literacy, so it should come as no surprise that kids are at high risk of taking fake news at face value. As deep fakes become more and more realistic, it’s important to help tweens and teens develop the skills they need to become savvy digital citizens.
Deep Fakes Are a Real Problem
The dangers of artificial intelligence are real, but there are also countless reasons to celebrate technology. As it progresses, though, it will continue to challenge us. Deep learning techniques have made undeniable contributions to society, and yet they also enable bad actors to do real harm to real people. It’s up to all of us to learn everything we can about issues like this so we can help the next generation navigate the ever-evolving digital landscape.
Teaching media literacy is an excellent place to start, and so is digital citizenship. But even the savviest kids can get in over their heads online. Bark can detect sexual content like nude fakes, as well as cyberbullying, depression, sexual predators, threats of violence, and more, in texts, chat, email, YouTube, and more than 30 social media platforms. Get a free, one-week trial to receive alerts for potential issues!