Danielle Sinay
Aug 11, 2021
A website pledging to make “men’s dreams come true” is using “state of the art” deepfake technology to “nudify” thousands of unsuspecting women, whose fully clothed images are uploaded to the site and then digitally disrobed without their consent.
The disturbingly convincing “deepfake” tool has surged in popularity since its launch last year, winning particular acclaim in dark, misogynistic corners of the web. According to HuffPost, which first reported the story, the site has amassed over 38 million visitors since the start of 2021.
The service claims it can “nudify” all female bodies, regardless of what they look like, though it doesn’t work on cisgender men: every image uploaded into the system is rendered with breasts and a vulva. For cisgender women, this presents an exceedingly dangerous dilemma, as the site enables a very real form of counterfeit revenge porn.
#Deepfake pornography is conventional pornography that has been doctored, often by #ArtificialIntelligence technolo… https://t.co/UOc0Soi9Pa — ConsentRevolution (@ConsentRevolution), August 7, 2021
The site’s AI-generated nudes are eerily realistic, displaying no telltale signs that they were fabricated. Any angry ex-boyfriend, stalker, or stranger in a bad mood can therefore believably claim to have someone’s actual nude photos, so long as they have a single fully clothed photo of their intended victim and, of course, access to the site.
“Holy mother [of] god,” one fan wrote of the “amazing” site in a deepfake porn forum. “Ive never seen [results like this] before ... the future is now.”
Deepfake porn primarily targets women and girls, and because this site works only on women’s bodies, it stands to perpetuate that harmful trend. Sensity AI, a research company that tracks deepfake content, found that between 90 and 95 percent of deepfake videos online are nonconsensual porn, and about 90 percent of those target women.
"When a sexual deepfake is made of someone, they can be so realistic that you couldn’t say otherwise and then the d… https://t.co/N0UC95ZKdz— End Cyber Abuse (@End Cyber Abuse) 1628265603
“This is a violence-against-women issue,” Adam Dodge, founder of EndTAB, a nonprofit that raises awareness about technology-enabled abuse, told MIT Technology Review.
While the site claims not to save the “nudified” images, it does provide users with shareable links to every doctored photo. Unsurprisingly, the supposedly “unsaved” fake nudes have been shared to Twitter, Facebook, Reddit, and other platforms, both privately and publicly. These links also serve as referrals for the original posters, earning them rewards, such as faster access to “nudify” more photos, for each new user who clicks. (Ordinarily, users can disrobe only one photo for free every two hours.)
As a result, Redditors have banded together to promote their own — and each other’s — referral links. “Help me and I’ll help you,” one Redditor wrote in a subreddit created for this very purpose.
Deepfake porn can be as devastating in its consequences as revenge porn. Now the law may finally ban it. https://t.co/cxnn01Zz2Z — MIT Technology Review (@MIT Technology Review), February 13, 2021
“This is a really, really bleak situation,” Henry Ajder, a deepfake expert, told HuffPost. “The realism has improved massively,” he added, noting that deepfake technology is typically weaponized against everyday women, not just celebrities and influencers.
“The vast majority of people using these [tools] want to target people they know,” he said.
What experts find most shocking is how social media platforms have seemingly turned a blind eye to the issue. “If they are a responsible platform and they care about this issue, they definitely should be taking that kind of action,” Mary Anne Franks, a law professor at the University of Miami and president of the Cyber Civil Rights Initiative, told HuffPost. “It really is incumbent upon them to be proactively looking for threats like this before they become problems. It shouldn’t be individuals or reporters or anyone else pointing this out.”
But social media platforms aren’t cracking down on the deepfakes, and, legally, they don’t necessarily have to. Under the controversial Section 230 of the Communications Decency Act, digital platforms aren’t liable for user-generated content. As these services evolve, and as social media continues to host them with no legal incentive to do otherwise, women and girls become increasingly less safe online.
“As these apps get more sophisticated, it will be impossible to tell the difference between something that’s manipulated to make you look naked and an actual naked picture,” Franks cautioned.
As of this writing, the website is on track to reach its highest monthly traffic since its creation.