[-] Surreal@programming.dev 50 points 1 year ago

If the man did not distribute the pictures, how did the government find out? Did a cloud service rat him out? Or spyware?

[-] sudo22@lemmy.world 51 points 1 year ago

My guess would be he wasn't self-hosting the AI model, so the requests were going through a website.

[-] Surreal@programming.dev 2 points 1 year ago

The service should have NSFW detection and ban users instantly if it detects that

[-] sudo22@lemmy.world 5 points 1 year ago

ChatGPT can be tricked into giving IED instructions if you ask the right way. So it could be a similar situation.

[-] BreakDecks@lemmy.ml 2 points 1 year ago

Why should it have that? Stable Diffusion websites know that most of their users are interested in NSFW content. I think the idea is to turn GPUs into cash flow, not to make sure that it is all wholesome.

I suppose they could get some kind of sex+children detector going for all generated images, but you're going to have to train that model on something, so now it's a chicken-and-egg problem.

[-] photonic_sorcerer@lemmy.dbzer0.com 14 points 1 year ago* (last edited 1 year ago)

~~He was found extorting little girls with nude pics he generated of them.~~

Edit: So I guess he just generated them. In that case, how'd they become public? I guess this is the problem with not reading the article.

[-] Missjdub@lemmy.world 35 points 1 year ago

> Earlier this month, police in Spain launched an investigation after images of underage girls were altered with AI to remove their clothing and sent around town. In one case, a boy had tried to extort one of the girls using a manipulated image of her naked, the girl’s mother told the television channel Canal Extremadura.

That was another case in Spain. Not the guy in Korea. The person in Korea didn’t distribute the images.

[-] Sir_Kevin@lemmy.dbzer0.com 12 points 1 year ago

Why the fuck isn't that the headline? Jesus, that's really awful and changes everything.

[-] Obonga@feddit.de 18 points 1 year ago

Because that was another case. Extortion and blackmail are already illegal, and in that case it would also count as production of CP, just as it would if you drew a real child. In this case we simply don't have enough information.

this post was submitted on 30 Sep 2023
557 points (100.0% liked)
