While the issue gained some public attention, it was mostly for the technology's novelty. After all, fake celebrity porn had been around the internet for years. But for advocates who work closely with domestic violence victims, the development was immediate cause for alarm. "What a perfect tool for somebody seeking to exert power and control over a victim," says Dodge.

It's become far too easy to make deepfake nudes of any woman. Apps for this express purpose have emerged repeatedly even though they have quickly been banned: there was DeepNude in 2019, for example, and a Telegram bot in 2020. The underlying code for "stripping" the clothes off photos of women continues to exist in open-source repositories. As a result, the scope of the abuse has grown: now targets are not just celebrities and Instagram influencers but private individuals, says Giorgio Patrini, Sensity's CEO and chief scientist. In the case of the Telegram bot, Sensity found there had been at least 100,000 victims, including underage girls.

Advocates also worry about popular deepfake apps that are made for seemingly harmless purposes like face-swapping.