‘IRL Fakes:’ Where People Pay for AI-Generated Porn of Normal People
404media.co
A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.
Well...shit. It seems like any new laws are already too little, too late, then.
Stable Diffusion has been easy to install locally and run on any decent GPU for two years at this point.
Combine that with Civitai.com for easy-to-download, ready-to-run models of almost anything you can imagine (IP, celebrities, concepts, etc.), and the possibilities have been endless.
In fact, completely free apps like Draw Things on iOS let you run it locally on YOUR PHONE: you can download models, tweak and customize them, and hand the app images straight from your device's photo library. Making this stuff is now trivial on the go.
Tensor processors/AI accelerators have also been standard on new hardware for a while. Mobile devices have them, Intel and Apple build them into their processors, and they're common on newer graphics cards.
That just lowers the bar further, compared to when this kind of task required a fairly powerful computer.