The consequences of making a nonconsensual deepfake

Making a nonconsensual deepfake could lead to legal and financial repercussions.
By Rebecca Ruiz
The consequences of making a nonconsensual deepfake image can be life-changing. Credit: Zain bin Awais / Mashable

Lawyer Sean Smith has seen up close how nonconsensual deepfakes, a form of image-based sexual abuse, can ruin lives.

Smith, a family law attorney with the Roseland, New Jersey, firm Brach Eichler, has recently represented the families of both minor victims and perpetrators in school disciplinary proceedings.

His clients have included teen girls whose images were taken from social media, then digitally "undressed" by their male classmates, who used software powered by artificial intelligence.

The apps and websites capable of creating explicit nonconsensual deepfakes typically market themselves as satisfying a curiosity or providing entertainment. As a result, users likely don't understand that the resulting imagery can inflict painful, lifelong trauma on the person whose likeness has been stolen — who is almost always a girl or woman. The victim may never be able to remove every synthetic photo or video from the internet, given how difficult it is to track and delete such content.

This can lead to professional, personal, and financial devastation for survivors. The same can be true for perpetrators once their name and reputation are associated with creating nonconsensual deepfakes. Students may face suspension or expulsion, and offenders may face criminal and civil penalties, depending on where they live.

"It destroys lives on every side," Smith told Mashable.

This typically isn't made clear to youth and adult users who engage in image-based sexual abuse.

Is it illegal to make a deepfake?

Though the apps themselves rarely warn users of any consequences, the rise of nonconsensual deepfakes has prompted several states to pass legislation criminalizing them.

Meanwhile, Congress has introduced but has yet to vote on a bill that would give victims the right to file a civil suit against perpetrators. A separate federal bill would criminalize the publication of nonconsensual intimate imagery, including that created by AI, and require social media companies to remove that content at a victim's request.

In some states, offenders can face civil penalties should the victim successfully sue them for damages. Their wages may be garnished or their property seized to pay for such damages.

Last year, Illinois amended an existing law in order to make deepfake offenders liable when they distribute nonconsensual synthetic images. A survivor can sue the person who disseminates the content for damages, which may result from emotional distress, the cost of mental health treatment, the loss of a job, and other related costs.


In New York, dissemination of nonconsensual deepfakes can lead to a year spent in jail, a fine, and a civil suit. Florida imposes both criminal and civil penalties for the "promotion" of nonconsensual synthetic material. The state's law also expanded the definition of "child pornography" to include deepfakes of minors engaged in sexual conduct.


Indiana, Texas, and Virginia are among the states that have made the creation of nonconsensual deepfakes punishable by jail time.

Many states, however, don't yet have laws that make the creation or distribution of deepfakes illegal, or that give victims the right to sue. Even where such laws exist, it may be difficult for victims to pursue criminal or civil penalties because the identity of the person who promoted the content is unknown, or because law enforcement lacks the staffing to investigate potential crimes.

But Matthew B. Kugler, professor of law at Northwestern University, says that shouldn't give people a false sense of security.

"When the laws get enforced, it's going to be a black mark that will follow a person for a very long time, and no one's going to feel bad about the fact that that black mark follows [the offender] for a very long time," Kugler says.

In 2020, Kugler studied public attitudes toward sexually explicit, nonconsensual deepfake videos in a survey of 1,141 U.S. adults. The vast majority of the respondents wanted to criminalize the act.

There is another potential legal consequence to creating nonconsensual deepfake imagery, regardless of whether the offender's state imposes criminal or civil penalties.

Adam Dodge, a lawyer and founder of Ending Tech-Enabled Abuse (EndTAB), says that a victim can file for a protective or restraining order if she knows who's responsible for the creation or distribution of the imagery. In many jurisdictions, image-based abuse qualifies as a form of harassment.

Such restraining orders are discoverable in background searches conducted by potential employers, Dodge says. A restraining order can also be applied to a youth offender. Though a minor's legal record is meant to be sealed, Dodge has seen instances where the information becomes public.

What happens to minors who create or share a nonconsensual deepfake

Teens who find deepfake apps or sites, either through word of mouth or ruthless internet marketing and search strategies, often don't grasp the potential fallout for victims or themselves, says Smith.

He notes that because the phenomenon is so new, school-based discipline can vary widely. At public schools, which are legally obligated to keep students enrolled where possible, punishment is often limited to brief in- or out-of-school suspensions.

But Smith says that private schools, with their own codes of conduct, may quickly escalate to expulsion.

The victim's parents may also pursue legal action in an effort to hold the perpetrator and their family accountable. Though Smith hasn't seen such a case yet, he expects some parents to begin filing civil lawsuits against a perpetrator's parents on the grounds of negligent supervision. Any damages won could potentially be covered by homeowner's insurance, unless the parents' carrier restricts such claims.

Teens could also be subject to criminal penalties under child pornography and other criminal statutes. Smith is aware of juvenile proceedings against teens who've created nonconsensual deepfakes. Though they did not serve time in jail, the offenders entered into private agreements with the state acknowledging culpability for their actions.

In Florida, however, two teens were arrested and charged with felonies last December for disseminating nonconsensual deepfakes.

Smith says that parents and teens urgently need to understand these and other consequences.

"The problem with this technology is that the parents and the kids don't realize how big a mistake the use of the technology is," Smith says. "How just the introduction of the technology onto a cellphone…can create this much larger lifetime mistake."

If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.

Rebecca Ruiz

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Prior to Mashable, Rebecca was a staff writer, reporter, and editor at NBC News Digital, special reports project director at The American Prospect, and staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a Master's in Journalism from U.C. Berkeley. In her free time, she enjoys playing soccer, watching movie trailers, traveling to places where she can't get cell service, and hiking with her border collie.

