650 points (100.0% liked)
submitted on 19 Sep 2023 by 0x815@feddit.de to c/europe@feddit.de

Police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that is presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

[-] aard@kyu.de 249 points 1 year ago

This was just a matter of time - and there isn't really much those affected can do (and in some cases, should do). Shutting down that service is the correct thing - but that'll only buy a short amount of time: training custom models is trivial nowadays, and both the skill and the hardware to do so are within reach of the age group in question.

So in the long term we'll see that shift to images generated at home, by kids often too young to be prosecuted - and you won't be able to stop that unless you start outlawing most AI image-generation tools.

At least in Germany, the laws dealing with child/youth pornography got badly botched by incompetent populists in the government - they would send any of those parents to jail for at least a year if they took possession of one of those generated pictures. Having one sent to their phone and going to the police to file a complaint would be sufficient to get a prosecution started against them.

There's one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying "they're AI generated" is becoming a plausible way out.

[-] alvvayson@lemmy.world 126 points 1 year ago

There's one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying "they're AI generated" is becoming a plausible way out.

Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.

Ironically, in a sense we will revert to the era before photography existed. To verify if something is real, we might have to rely on witness testimony.

[-] Blapoo@lemmy.ml 56 points 1 year ago

Politics is about to get WILD

[-] JoBo@feddit.uk 37 points 1 year ago

Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.

This just isn't true. They will still be used to sexualise people, mostly girls and women, against their consent. It's no different from AI-generated child pornography. It does harm even if no 'real' people appear in the images.

Fucking horrible world we're forced to live in. Where's the fucking exit?

[-] GreatGrapeApe@reddthat.com 18 points 1 year ago

It is different than AI-generated CSAM because real people are actually being harmed by these deepfake images.

[-] JoBo@feddit.uk 15 points 1 year ago* (last edited 1 year ago)

I was replying to someone who was claiming they aren't harmful as long as everyone knows they're fake. Maybe nitpick them, not me?

Real kids are harmed by AI CSAM normalising a problem they should be seeking help for, not getting off on.

[-] taladar@feddit.de 36 points 1 year ago

To verify if something is real, we might have to rely on witness testimony.

This is not going to work. Just because images and videos become less reliable doesn't mean we will forget that eyewitness testimony is very unreliable.

[-] Khanzarate@lemmy.world 25 points 1 year ago

You say "forget" like it's not still incredibly common as evidence.

There's lots of data showing that eyewitnesses aren't reliable, but that doesn't mean courts have actually stopped relying on them. AI making another form of evidence untrustworthy will just result in eyewitnesses taking its place.

[-] hansl@lemmy.world 18 points 1 year ago

A bit off topic, but I wonder if the entertainment industry as a whole is going to be completely destroyed by AI when it gets good enough.

I can totally see myself prompting “a movie about love in the style of Star Wars, with Ryan Gosling and Audrey Hepburn as the leads, directed by Alfred Hitchcock, written by Victor Hugo.” And then what? It’s game over for any content creation.

Curious if I’ll see that kind of power at home (using open source tools) in my lifetime.

[-] Seudo@lemmy.world 18 points 1 year ago

Same goes for any deepfake. People are losing their shit because we won't know what's real and what's not!

We should have been teaching critical thinking a generation ago. Sagan was pleading for reform in the 90s. We can start teaching the next generation how to navigate the Information Age. What we can't do is make the world childproof.

[-] rufus@discuss.tchncs.de 68 points 1 year ago* (last edited 1 year ago)

Interesting. Replika AI, ChatGPT etc. crack down on me for doing erotic stories and roleplay text dialogues. And this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head...

I wonder why they have no address etc. on their website and the app isn't available in any of the proper app stores.

Obviously the police should ask Instagram who is blackmailing all these girls... Teach them a proper lesson. And then stop this company. Have them fined a few million for generating and spreading synthetic CP. At least write a letter to their hosting or payment providers.

[-] them@lemmy.world 53 points 1 year ago

Yes, let's name the tool in the article so everybody can participate in the abuse

[-] RaivoKulli@sopuli.xyz 31 points 1 year ago

I doubt not naming it would do much of anything.

[-] DarkThoughts@kbin.social 11 points 1 year ago

Considering that AI services typically cost money, especially those advertising adult themes, naming it kinda does support the hosters of such services.

[-] RaivoKulli@sopuli.xyz 12 points 1 year ago

Then again, naming and shaming puts pressure on them too. But in the end I doubt it matters. Those who want to use them will find them.

[-] rayyyy@kbin.social 44 points 1 year ago

The shock value of a nude picture will become increasingly humdrum as they become more widespread. Nudes will become so common that no one will bat an eye. In fact, some less endowed, less perfect ladies will no doubt make AI-generated pictures or movies of themselves to sell on the internet. Think of it as Photoshop x 10.

[-] DessertStorms@kbin.social 59 points 1 year ago

This isn't about nude photos, it's about consent.

[-] andrai@feddit.de 61 points 1 year ago

I can already get a canvas and brush and draw what I think u/DessertStorms looks like naked and there is nothing you can do about it.

[-] DessertStorms@kbin.social 15 points 1 year ago

You're not making the point you think you are, instead you're just outing yourself as a creep. ¯_(ツ)_/¯

[-] andrai@feddit.de 23 points 1 year ago

Hey, you dropped this \

¯\_(ツ)_/¯

[-] taladar@feddit.de 36 points 1 year ago

Photoshopped nude pictures of celebrities (and of people the photoshopper knew personally) have been around for at least 30 years at this point. This is not a new issue as far as the legal situation is concerned; only the ease of doing it has changed a bit.

[-] AbaixoDeCao@lemm.ee 34 points 1 year ago

That's really, really sad. EU, please try to regulate AI.

[-] MargotRobbie@lemm.ee 30 points 1 year ago

Banning diffusion models doesn't work; the tech is already out there and you can't put it back in the box. Fake nudes used to be done with Photoshop; the current generative AI models only make them faster to produce.

This can only be stopped on the distribution side, and any new laws should focus on that.

But the silver lining of this whole thing is that nude scandals for celebs aren't really possible any more if you can just say it's probably a deepfake.

[-] negativeyoda@lemmy.world 29 points 1 year ago

Can this come full circle so I can shirtcock it and later say, "dog, that's AI" when people post pictures?

[-] Sigmatics@lemmy.ca 27 points 1 year ago* (last edited 1 year ago)

The only thing new about this is that the photos are probably more realistic, but still fake. Apps to do this existed before GenAI was a thing.

[-] YurkshireLad@lemmy.ca 27 points 1 year ago

Maybe something will change as soon as people start creating and distributing fake AI nudes of that country’s leaders.

[-] Risk@feddit.uk 19 points 1 year ago

Honestly surprised this didn't happen first.

It'd be a great way to discredit politicians in homophobic states, by showing a politician taking it up the arse.

[-] Sabata11792@kbin.social 12 points 1 year ago

It's already happened, and there is not enough bleach in the world to unsee it.

[-] duxbellorum@lemm.ee 16 points 1 year ago

This seems like a pretty significant overreaction. Like yes, it’s gross and it feels personal, but it’s not like any of the subjects were willing participants, so their reputation is not being damaged. Would they lose their shit about a kid gluing a cutout of their crush’s face over the face of a pornstar in a magazine? Is this really any different from that?

[-] 0x815@feddit.de 25 points 1 year ago

These are schoolgirls in their teenage years. To them and their parents, this must be a nightmare.

[-] RagnarokOnline@reddthat.com 18 points 1 year ago

I don’t want to bandwagon against you, but I do think it’s important that people who agree with your viewpoint have a chance to understand that the situation is a violation of privacy.

The kid’s reputation is likely damaged. You have an underage girl who is already dealing with the confusion and hierarchy of high school. Then (A) someone generates semi-accurate photos of what her naked body looks like and (B) distributes them to others.

Issue (A) is bad because it’s essentially CSAM, and also because it’s attempting to access a view of someone that the subject likely hasn’t permitted the generator to have. This is a privacy violation, and the ethics around it are questionable at best.

Issue (B) is that the generator didn’t stop at the violations of issue (A), but has now shared that material with other people who know the subject without the subject’s consent, and likely without her knowledge of the recipients. This means that the subject now has to perpetually wonder if every person they interact with (friends, teachers, other parents, her own parents) have seen lewd pictures of her. Hopefully you can see how this could disturb a young woman.

Now apply a different situation to it. Suppose you took a test at school or at work that shows you as dumb (like, laughably dumb; enough to make you feel self-conscious). Even if you don’t think it’s a fair test, this test exists. Now, assume that someone shared this test with your friends, co-workers, and even your parents without you knowing exactly who received it. And instead of everyone saying “it’s just a dumb test — it doesn’t mean anything”, they decide it means something about you. Every hour or so, you walk by someone or interact with someone who chuckles or cracks a joke at your expense. You’re not allowed by your community to move on from this test.

Before your test was released, you could blend in. Now, you’re the person everyone is looking at and judging. Think of that added anxiety on top of everything else you have to deal with.

[-] danhab99@programming.dev 14 points 1 year ago

I tried the AI with a pic of me. It was incredibly inaccurate and gave me something between a dick and a vagina. Nothing truly damaging.
