Imagine waking up to find out someone created fake sexually explicit content of you without your consent. Now imagine it circulating all over the internet for the world to see.

It's a nightmare that's become a reality for more and more women in the era of artificial intelligence and deepfake pornography, and one that's claimed its latest high-profile victim in Taylor Swift.

Women have been sounding the alarm on deepfake porn for years. But now that the biggest pop star in the world has been targeted, is this a turning point in how seriously our society takes this issue? It's possible, experts say. They hope those who've been impacted by deepfakes and enraged Swifties can join forces to pressure lawmakers to further regulate this technology and save future victims, particularly women, from facing the lasting trauma this causes.

"It is demeaning, and it's also a form of cyberbullying, so it can severely impact mental health," psychotherapist Stephanie Sarkis says. "It's not just famous people. This is happening to your normal, everyday person too."

Deepfake porn causes real trauma

Deepfake pornography involves using computer technology, sometimes AI, to generate realistic pornographic content of a real person, usually without their consent.

Those targeted by deepfake porn can experience severe trauma, including anxiety, depression, dissociation, intrusive thoughts, shame, humiliation and suicidal ideation. The psychological fallout can be so debilitating it may leave victims unable to go to work or school or leave the house.

Sarkis says people who create and distribute deepfake porn often do so with the same intention as those who spread revenge porn: It's not just about sexual gratification, but about feeling power and control over the victim.

"It also sends a message of, 'If you are successful, this is what happens to you,' " Sarkis adds of deepfake porn that targets celebrities, like Swift. "Or, 'if you upset someone, this is what happens to you.'"


It's also an issue that seems to get worse each year as this technology becomes more sophisticated and commonplace.

A September 2019 report by Deeptrace Labs found that non-consensual deepfake porn constituted 96% of all deepfake videos online. In October 2023, Wired shared findings from an anonymous independent researcher showing that 113,000 deepfake videos were uploaded to the most popular deepfake porn websites in the first nine months of 2023, a significant uptick from the 73,000 uploaded in all of 2022.

Deepfakes don't just proliferate on porn sites. The images depicting Swift circulated widely on X, formerly known as Twitter, and were viewed millions of times before their removal.

Psychotherapist Jessica Klein says non-consensual image sharing is an issue she's seen more and more people have to deal with in recent years.

"Seeing an image can really feel like it's actually happened to your body, that there has been a violation of your basic sense of bodily integrity, bodily sovereignty," she says. "Anything that would place one's body or an image of one's body outside the realm of safety and their own control is really putting someone at risk for developing trauma."


Is the Taylor Swift incident a turning point?

The Swift situation has drawn widespread criticism of deepfake porn, with women coming forward to share their experiences.

"As someone who was also a victim of deepfake porn a few years ago my heart hurts for Taylor so much," wrote one X user. CNN commentator S.E. Cupp wrote a New York Daily News piece about the trauma of having her likeness edited into a pornographic scenario in Hustler over a decade ago. "This one is personal and hard to talk about," Cupp wrote on X. "But it’s too important."

Last year, a Twitch streamer under the name QT Cinderella shared a raw video showing her tearful reaction to discovering deepfake porn made using her likeness.

"I wanted to show this is a big deal," QT, who asked we refer to her using her username for privacy reasons, told USA TODAY last year. "That every single woman on that website, this is how they feel. Stare at me sobbing, and tell me you still think this is OK."


Swift herself has yet to publicly comment on the images. But Klein says it's important to remember that, no matter how famous anyone is, trauma "doesn't discriminate."

"She's still going to likely experience some of the sense of violation that any person without celebrity status would experience," she says. "Most survivors would probably hope that it is just another place that awareness around the issue can be raised."

