98% of deepfakes are porn
AI image generation has become outrageously good in the past 12 months, and some people (mostly men) are increasingly using the tech to create homemade deepfake porn of people they fantasize about, using pics culled from social media.
The subjects hate it, of course, and the practice has been banned in the United Kingdom. In the United States, however, there is no federal law outlawing the creation of deepfakes without consent.
The Nudify app (Nudify.online)
Face-swapping mobile apps like Reface make it simple to graft a picture of someone’s face onto existing porn images and videos. AI tools like DeepNude and Nudeify generate a realistic rendering of what the AI thinks a person looks like nude. The NSFW AI art generator can even crank out anime porn deepfakes for $9.99 a month.
According to social network analytics company Graphika, there were 24 million visits to this genre of websites in September alone. “You can create something that actually looks realistic,” analyst Santiago Lakatos explains.
Such apps and sites are mainly advertised on social media platforms, which are slowly starting to take action, too. Reddit has a prohibition on nonconsensual sharing of faked explicit images and has banned several domains, while TikTok and Meta have banned searches for keywords relating to “undress.”
Around 98% of all deepfake vids are porn, according to a report by Home Security Heroes. We can’t show you any of them, so here’s one of Biden, Boris Johnson and Macron krumping.
“AI will destroy the world”
Meantime AI: pic.twitter.com/Rz2UAuOQQS
— Enzo Avigo (@0zne) December 11, 2023
Technology- and celebrity-obsessed South Korea leads the trend, with South Koreans the subjects of 53% of all deepfake porn on the web.
K-pop singers (58%) and South Korean actresses (33%) make up the overwhelming majority of targets, with one singer the subject of 1,595 videos that have been viewed more than 5.5 million times.
A survey of 1,522 American men found that while 68%…