Artificial intelligence / February 21, 2024

Deepfake nudes take centre stage

Nimmi Kanji

Director - Social Purpose Programs, For Good and TELUS Wise

If it can happen to Taylor Swift, it can happen to anyone. At the end of January, fake pornographic images and video of Swift spread like wildfire around the Internet.

It took X (formerly Twitter) 17 hours to take down the fakes, and the platform temporarily blocked searches related to the pop star. But the damage was already done: the deepfakes had been viewed tens of millions of times.

Many are now talking about the Taylor Effect (not just reserved for NFL football it seems). Targeting someone of her stature, profile, reach and influence shows just how widespread and threatening the misuse of artificial intelligence (AI) has become.

The undressing phenomenon

So how did these dupes of Taylor end up on the Internet? Someone uploaded pictures or video footage of Taylor to AI “undressing” apps and then generated a nude likeness of her.

In its December 2023 report, A Revealing Picture, social media analysis firm Graphika highlighted the growing popularity and accessibility of these types of AI apps and services. In September 2023 alone, the 34 websites Graphika reviewed drew 24 million unique visitors - a statistic that should raise concern around the globe.

According to Wired, 50 – 80% of people find undressing websites and tools through search. Google search results in Canada, Germany, Japan, Brazil, South Africa and Australia all display a number of sites capable of creating deepfake nudes in a matter of minutes.

It’s not just celebrities at risk

While the Taylor Swift story highlighted just how pervasive and alarming the growth of deepfake nudes is, anyone can become a target. And women are affected disproportionately.

In December 2023, CBC ran a story about a group of girls from Winnipeg who had deepfake nude photos of them posted online. The girls ranged from grades seven to 12. The source photos were gathered from publicly accessible social media and then altered using AI. It's one of the first cases in Canada where students targeted other students with deepfake nudes.

It’s vital that youth, and people in general, understand that creating deepfake nudes is an extreme form of cyberbullying, one that can have a long-lasting negative impact on a person’s well-being and reputation, even after the content has been confirmed to be a deepfake.

Combatting deepfakes

With the ease of accessing deepfake creation sites and services, how can you mitigate the risk for you and your family?

  • Recognize the sites: many of these providers use social media to advertise and to spam comment sections with links to their sites. Be on the lookout. Typically, “undressing” sites and services will have “deepfake” in their names.
  • Lock down your profiles and limit what you share: many people create deepfake nudes with a photo they were able to easily access online. Keep your social media profiles private and if you are sharing photos, only do so with trusted people. Upload only low-res images and video or limit your content to focus on the things around you (e.g. nature, architecture, pets). It’s important to note that much of the non-consensual, deepfake intimate content is created by people known to their targets (revenge porn for instance).
  • Go back to the basics: it can’t be reiterated enough – good security starts with strong, unique passwords. They are a basic, yet vital, layer of protection. Do not take your passwords for granted. And choose multi-factor authentication whenever you have the option.
  • Talk openly about AI and ethics: AI is relatively new and is changing all the time. Keep having conversations about it, using real examples (I’m sure the kids in your life know about what happened to Taylor Swift). Discuss the risks and rewards, so kids feel confident exploring and experimenting with AI but also know how to use it safely and responsibly.

When used for good, AI is a remarkable tool to push the boundaries of ideas and creativity. But unfortunately, there is a dark side, and we are seeing that with the explosion of undressing applications and deepfake nudes. While this phenomenon may feel like it’s beyond our control, basic digital hygiene can help you protect your likeness from being used maliciously. Nobody is immune. But hopefully the Taylor Effect will help more kids understand why privacy in our digital world and AI ethics are so important.

To learn more about AI with the youth in your life, complete the online TELUS Wise responsible AI workshop and test your knowledge with this quiz.
