Big tech companies including Google, Apple, and Discord are making it easy for people to sign up to harmful “undress” websites, which use AI to remove clothing from real photos so that victims appear “naked” without their consent. More than a dozen of these deepfake websites have been using the tech companies’ login buttons for months.
WIRED’s analysis found 16 of the biggest so-called undress and “nudify” websites using the sign-in infrastructure of Google, Apple, Discord, X (formerly Twitter), Patreon, and Line. The approach lets people easily create accounts on the deepfake sites, lending them a veneer of credibility, before they pay for “credits” to generate images.
Bots and websites that create nonconsensual intimate images of women and girls have existed for years, but the arrival of generative AI has multiplied their number. This kind of “nudify” abuse is alarmingly widespread, with teenage boys allegedly creating images of their classmates. Critics say tech companies have been slow to respond to the scale of the problem, pointing out that the websites rank highly in search results, paid advertisements promote them on social media, and related apps appear in app stores.
“This is a continuation of the trend of big tech companies normalizing sexual violence against women and girls,” said Adam Dodge, attorney and founder of EndTAB (Ending Technology-Enabled Abuse). “Sign-in APIs are a tool of convenience. Sexual violence shouldn’t be an act of convenience,” he said. “Instead of building walls around access to these apps, they’re giving people a drawbridge.”
The sign-in tools WIRED analyzed were deployed through the companies’ developer APIs and standard authentication flows, allowing people to join the deepfake websites using their existing accounts. Google’s login system appeared on all 16 websites, Discord’s on 13, and Apple’s on six. X’s button appeared on three websites, while Patreon’s and messaging service Line’s appeared on the same two.
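For context on how such buttons work: “Sign in with …” integrations are generally built on the OAuth 2.0 authorization-code flow, in which a site registers a developer account with the platform and receives a client ID. The sketch below is a minimal, generic illustration of that flow using Google’s documented public endpoints; the client ID, client secret, and redirect URI are placeholders, not values tied to any real site.

```python
# Minimal sketch of the OAuth 2.0 authorization-code flow behind a
# generic "Sign in with Google" button. CLIENT_ID, CLIENT_SECRET, and
# REDIRECT_URI are placeholders; the two endpoints are Google's
# documented public OAuth 2.0 endpoints.
import secrets
import urllib.parse

import requests

AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"
TOKEN_ENDPOINT = "https://oauth2.googleapis.com/token"

CLIENT_ID = "PLACEHOLDER.apps.googleusercontent.com"  # issued via the developer console
CLIENT_SECRET = "PLACEHOLDER_SECRET"
REDIRECT_URI = "https://example.com/oauth/callback"


def build_login_url() -> str:
    """Build the URL the sign-in button sends a visitor to."""
    params = {
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "response_type": "code",
        "scope": "openid email",
        "state": secrets.token_urlsafe(16),  # anti-CSRF value, checked on return
    }
    return f"{AUTH_ENDPOINT}?{urllib.parse.urlencode(params)}"


def exchange_code(code: str) -> dict:
    """Trade the one-time code from the redirect for tokens identifying the user."""
    resp = requests.post(
        TOKEN_ENDPOINT,
        data={
            "code": code,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "redirect_uri": REDIRECT_URI,
            "grant_type": "authorization_code",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # includes an ID token asserting the user's identity
```

The relevant point is the low barrier: once a developer account and client ID exist, any website can present the button, and it keeps working until the platform revokes that account, which is the step Discord and Apple describe taking below.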
WIRED is not naming the websites, because they enable abuse; some are part of broader networks and owned by the same people or companies. The login systems have been exploited despite the tech companies’ broad rules stating that developers must not use their services in ways that enable harm, harassment, or invasion of privacy.
Contacted by WIRED, spokespeople for Discord and Apple said they had removed the developer accounts connected to the websites. Google said it would take action against developers when it finds violations of its terms. Patreon said it bans accounts that allow the creation of explicit imagery, and Line confirmed it was investigating but said it could not comment on specific websites. X did not respond to a request for comment about the websites’ use of its sign-in system.
Hours after Discord’s vice president of trust and safety, Judd Hoffman, told WIRED that the company had suspended the websites’ access to its API for violating its developer policies, one of the undress websites posted to its Telegram channel that authentication via Discord was “temporarily unavailable,” claiming it was working to restore access. That service did not respond to WIRED’s request for comment about its operations.
Rapid expansion
The creation of nonconsensual sexual videos and images has grown exponentially since deepfake technology first emerged in late 2017. While videos remain harder to produce, the manipulation of images through “undress” and “nudify” websites and apps has become commonplace.
“We need to be clear that this is not innovation, this is sexual abuse,” said San Francisco City Attorney David Chiu, whose office recently filed a lawsuit against undress and “nudify” websites and their creators. Chiu said the 16 websites his office is suing received about 200 million visits in the first half of this year alone. “These websites are perpetrating horrific exploitation of women and girls around the world. These images are used to bully, humiliate and blackmail women and girls,” Chiu argued.