A Deepfake Porn Bot Is Being Used to Abuse Thousands of Women

Pornographic deepfakes are being weaponized at an alarming scale, with at least 104,000 women targeted by a bot operating on the messaging app Telegram since July. The bot is used by thousands of people every month to create nude images of friends and family members, some of whom appear to be under the age of 18.

This story originally appeared on WIRED UK.

The still images of nude women are generated by an AI that ‘removes’ items of clothing from a non-nude photo. Every day the bot sends out a gallery of new images to an associated Telegram channel, which has almost 25,000 subscribers. The sets of images are frequently viewed more than 3,000 times. A separate Telegram channel that promotes the bot has more than 50,000 subscribers.
Some of the images produced by the bot are glitchy, but many could pass for genuine. “It is maybe the first time that we are seeing these at a massive scale,” says Giorgio Patrini, CEO and chief scientist at deepfake detection company Sensity, which conducted the research. The company is publicizing its findings in a bid to pressure services hosting the content to remove it, but it is not publicly naming the Telegram channels involved.

The actual number of women targeted by the deepfake bot is likely much higher than 104,000. Sensity was only able to count images shared publicly, and the bot gives people the option to generate photos privately. “Most of the interest for the attack is on private individuals,” Patrini says. “The very large majority of those are for people that we cannot even recognize.”
As a result, it is likely that very few of the women who have been targeted know the images exist. The bot and a number of Telegram channels linked to it are primarily Russian-language but also offer English translations. In a number of cases the images created appear to depict girls under the age of 18, Sensity adds, saying it has no way to verify this but has informed law enforcement of their existence.

Unlike the non-consensual explicit deepfake videos that have racked up millions of views on porn websites, these images require no technical knowledge to create. The process is automated and can be used by anyone—it’s as simple as sending an image through a messaging app.

The images are automatically created once people upload a clothed image of the victim to the Telegram bot from their phone or desktop. Sensity’s analysis says the technology only works on images of women. The bot is free to use, although it limits people to ten images per day, and payments have to be made to remove watermarks from the images. A premium version costs around $8 for 112 images, Sensity says.

“It's a depressing validation of all the fears that those of us who had heard about this technology brought up at the beginning,” says Mary Anne Franks, a professor of law at the University of Miami. Franks provided some feedback on the Sensity research before it was published but was not involved in the report’s final findings. “Now you've got the even more terrifying reality that it doesn't matter if you've never posed for a photo naked or never shared any kind of intimate data with someone, all they need is a picture of your face.”
The code was quickly backed up and copied. The DeepNude software uses deep learning and generative adversarial networks to generate what it predicts victims’ bodies look like. The AI is trained on a set of images of clothed and naked women and is able to synthesize body parts in the final images.