submitted 1 year ago by L4s@lemmy.world to c/technology@lemmy.world

AI-created child sexual abuse images ‘threaten to overwhelm internet’::Internet Watch Foundation finds 3,000 AI-made abuse images breaking UK law

[-] EatYouWell@lemmy.world 22 points 1 year ago

Honestly, even though I find the idea abhorrent, if it prevents actual children from being abused...

I mean, the content is going to be generated one way or another.

[-] Plopp@lemmy.world 10 points 1 year ago

But the poor algorithms that are forced to generate the content!

[-] HeavyDogFeet@lemmy.world 7 points 1 year ago* (last edited 1 year ago)

Does it? Or is it just bonus content for pedophiles? Just because they’re now getting thing B doesn’t mean they’re not also still getting thing A. In fact, there’s nothing to suggest this wouldn’t just make things worse. What’s to stop them from simply using it like a sandbox to test out shit they’ve been too timid to do themselves in real life? Little allowances like this are a pretty common way for people to build up to committing bolder crimes. It’s a textbook pattern for serial killers; what’s to say it wouldn’t serve the same purpose here?

But hey, if it does result in less child abuse material being created, that’s great. But there’s no evidence that this is actually how it will play out. It’s just wishful thinking because people want to give generative AI the benefit of the doubt that it is a net positive for society.

Anyway, rant over. You might be able to tell that I have strong feelings about the benefits and dangers of these tools.

[-] Igloojoe@lemm.ee 16 points 1 year ago

Your argument sounds very similar to when people argue that video games promote violence and criminal activity.

[-] HeavyDogFeet@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

That’s quite a stretch. For a start, playing video games isn’t illegal; generating child porn is. Graduating from something innocent to something criminal is very different from starting off at one of the most heinous crimes in modern society and then continuing to do variations of that same crime.

[-] hahattpro@lemmy.world 2 points 1 year ago

Yeah, you play Tetris as a kid, then you become a serial killer with a brick as an adult.

Yikes

[-] burliman@lemm.ee 5 points 1 year ago

I’m guessing they could easily test this with a simple premise: examine a legal fetish that AI can generate images of, and ask people who generate those images whether their consumption of real images has fallen as a result. Also check whether actual real-life participation has been reduced by the ability to generate the scenarios privately.

The results will be skewed precisely because the fetish is legal, since participating won’t land you in jail. But some fetishes carry other risks besides legal ones, which could help control for that.

[-] Heavybell@lemmy.world 1 points 1 year ago

Where does the fact some people will happily jerk it exclusively to anime titty come into this, I wonder?

[-] Brahminman@iusearchlinux.fyi 1 points 1 year ago

It doesn't? They're an edge case, and if you only jerk it to anime then nobody is harmed in the production of the porn you watch, so nobody should care or read too much into it.

[-] MTK@lemmy.world 4 points 1 year ago

Also, the models were trained on real images; every image these tools create is directly related to the rape of thousands or even tens of thousands of children.

Real or not, these images came from real children who were raped in the worst ways imaginable.

[-] Bye@lemmy.world 4 points 1 year ago

I don’t think that’s the case

[-] MTK@lemmy.world 1 points 1 year ago
[-] Bye@lemmy.world 2 points 1 year ago

You don’t need the exact content you want in order to train a model (LoRA) for SD. If you train on naked adults and clothed kids, it can make some gross shit. And there are a lot more of those safe pictures out there to use for training. I’d bet my left leg that these models were trained that way.

[-] MTK@lemmy.world 1 points 1 year ago

Why? If these people have access to these images, why would you bet that they don’t use them?

There are dark web sites hosting huge sets of CSAM; why would these people not use that? What are you betting on? Their morals?

this post was submitted on 26 Oct 2023
104 points (100.0% liked)
