What You Should Know About Tech-Enabled Sexual Violence

On Monday, the U.S. Congress passed the Take It Down Act, a bill that would criminalize the creation and sharing of intimate, nonconsensual deepfakes. The new law “requires online platforms to remove these images within 48 hours of being reported,” among other things, according to a release from the Rape, Abuse & Incest National Network (RAINN).

The bill was a bipartisan initiative, first introduced about a year ago; RAINN has been a prominent advocate throughout. It now goes to the President for final approval, and there is no word yet on when he will act on it.

Deepfakes and other forms of online sexual violence, such as revenge porn, fall under the umbrella of “technology-enabled sexual abuse.” It’s not a term you hear often, but you’re likely to hear it more and more as artificial intelligence becomes more widespread.

“Technology-enabled sexual assault is the next frontier in the fight against sexual violence,” Scott Berkowitz, founder and president of RAINN, said in the release. “I have never seen any form of abuse grow so rapidly in RAINN’s 31 years, and this law is critical to stopping it.”

Below, we explain exactly what technology-enabled sexual abuse is, some of the biggest misconceptions about it, and more.
What is technology-enabled sexual abuse?

It’s a set of online actions and behaviors that become abuse when “explicit content is created or shared without the individual’s consent,” according to RAINN. The organization also points out that pornography does not fall into this category, as it is something adults can consent to and that can be legally distributed. According to the United Nations Population Fund (formerly the United Nations Fund for Population Activities, which still uses the acronym UNFPA), there are about 10 different types of technology-enabled sexual violence. Below are some of the most common, each with a brief description based on the UNFPA glossary:
Image-based abuse
Sharing intimate photos without someone’s consent.

Sextortion
Threatening to publish someone’s intimate photos unless they comply with demands, such as for money, more images, or sexual acts.

Revenge porn
The term “pornography” is a misnomer in this context, since pornography is something adults can consent to. Revenge porn is a type of image-based abuse and is more accurately called “non-consensual intimate imagery,” or NCII, the term RAINN uses.

Deepfake
This is one of the most common forms these days, and involves putting one person’s face on another person’s body, usually in explicit images or videos. It is typically done with advanced AI face-swapping technology; cruder versions made with simpler editing tools are known as “shallowfakes.”

What’s a common misconception about tech-enabled sexual violence?
According to Jennifer Simmons Kaleba, vice president of communications at RAINN, one of the biggest misconceptions about tech-enabled sexual violence is who it can affect. Many people think it only happens to celebrities with tens of thousands of photos online, she says, but that’s no longer the case. “Technology has advanced so rapidly that we can apply the same type of swapping technology to everyday people,” she says. Yes, that’s a little scary, but the point isn’t to frighten you; it’s to keep you alert and aware. If you find yourself a victim of tech-enabled sexual abuse, know that support is available through organizations like RAINN.
How is it different from other types of sexual violence?

The obvious difference is that tech-enabled sexual abuse isn’t a physical violation of someone’s body; the abuse can happen entirely online, and “it might not even be your body in the picture,” Simmons Kaleba explains.

But that doesn’t lessen the devastating impacts. “The same feelings that accompany a typical case of sexual abuse—PTSD, shame, anxiety, and depression, to name a few—are applied to this,” she says. “We’re just starting to understand it, as well as how serious or pervasive it is.”