Teen boys use AI to make fake nudes of classmates, sparking police probe
(arstechnica.com)
The other comment about how this has been happening for a long time (with low-tech methods) is true, and it's also true that we can't stop this completely. But we can still respond to it:
An immediate and easy focus would be on what they do with the images. Sharing them around is still harassment/bullying, and it should be dealt with the same way it currently is.
There's also an education aspect. In the past, those images (magazines, photocopies, Photoshop jobs) were limited in who could see them. Kids now are likely using free online tools that aren't private or secure, and those images could stick around forever. So it could be good to highlight that.
Your first point seems to forget something important: kids are often cruel, and bullying is frequently the point. So long-term consequences for their classmates can be an incentive rather than a deterrent.
Yeah.
What teacher says: You shouldn't do this because it might hurt somebody.
What some kids hear: Check out this new way to hurt somebody and get horny at the same time. And as an added benefit, you can say it's only about the first one if admitting the second one would hurt you socially, even if the second one was the whole original point!
To your first point: much to the benefit of humanity, and counter to popular belief, the internet is NOT forever. Between link rot, data purges, corporate buyouts, transmission compression losses, and general human stupidity, large swaths of the internet have vanished. Hell, just Macromedia selling out to Adobe ended up causing the loss of most of the popular internet games and videos for anyone in their mid-to-late 30s at this point (you will be missed, Flash). The odds of these specific AI-generated images surviving even in some dark corner of the bright web are slim to none. And if they end up surviving on the dark web, well, anyone who sees them will likely have a LOT of explaining to do.
Also, regarding the comments about the websites keeping the images: that's doubtful, beyond holding them in an account-bound locker for the user to retrieve. The sites don't care, and too many images get generated every day for them to see the output as anything more than reinforcement training data.
Speaking of training: they may have been able to use Photoshop's new generative fill to do this, but to actually generate fresh images of a specific peer they would have had to train a LoRA or Hypernetwork on photos of the girl so Stable Diffusion could actually resolve her likeness. They weren't doing that on an AI site, especially not a free one. They were probably using ComfyUI or Automatic1111 (I use both myself). Those are free, open-source, locally executed tools that let you use the aforementioned techniques when generating. That means the images were restricted to a local machine, then transferred to a cell phone and distributed to friends.
https://www.theatlantic.com/technology/archive/2021/06/the-internet-is-a-collective-hallucination/619320/
I think we should pressure the EU to require that any online AI image-generation website also use AI to verify that what was requested isn't illegal.