submitted 2 years ago by 0x815@feddit.de to c/europe@feddit.de

Police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that is presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

[-] aard@kyu.de 249 points 2 years ago

This was just a matter of time - and there isn't really that much the affected can do (and in some cases, should do). Shutting down that service is the correct thing - but that'll only buy a short amount of time: Training custom models is trivial nowadays, and both the skill and hardware to do so is in reach of the age group in question.

So in the long term we'll see that shift to images generated at home, by kids often too young to be prosecuted - and you won't be able to stop that unless you start outlawing most of AI image generation tools.

At least in Germany, the laws dealing with child/youth pornography were badly botched by incompetent populists in the government - they would send any of those parents to jail for at least a year if they took possession of one of the generated pictures. Having one sent to their phone and going to the police to file a complaint would be sufficient to get prosecution started against them.

There's one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying "they're AI generated" is becoming a plausible way out.

[-] alvvayson@lemmy.world 126 points 2 years ago

There's one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying "they're AI generated" is becoming a plausible way out.

Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.

Ironically, in a sense we will revert back to the era before photography existed. To verify if something is real, we might have to rely on witness testimony.

[-] Blapoo@lemmy.ml 56 points 2 years ago

Politics is about to get WILD

[-] JoBo@feddit.uk 37 points 2 years ago

Indeed, once the AI gets good enough, the value of pictures and videos will plummet to zero.

This just isn't true. They will still be used to sexualise people, mostly girls and women, against their consent. It's no different from AI-generated child pornography. It does harm even if no 'real' people appear in the images.

Fucking horrible world we're forced to live in. Where's the fucking exit?

[-] GreatGrapeApe@reddthat.com 18 points 2 years ago

It is different than AI-generated CSAM because real people are actually being harmed by these deepfake images.

[-] JoBo@feddit.uk 15 points 2 years ago* (last edited 2 years ago)

I was replying to someone who was claiming they aren't harmful as long as everyone knows they're fake. Maybe nitpick them, not me?

Real kids are harmed by AI CSAM normalising a problem they should be seeking help for, not getting off on.

[-] GreatGrapeApe@reddthat.com 3 points 2 years ago

I'm addressing you because you made the claim that they are equivalent when they clearly are not.

[-] JoBo@feddit.uk 10 points 2 years ago* (last edited 2 years ago)

No I didn't. Go nitpick someone else.

Or better still, explain why you think AI-generated CSAM isn't harmful. FFS

[-] SharkEatingBreakfast@sopuli.xyz 8 points 2 years ago* (last edited 2 years ago)

Let's be real here:

Sure, it's not illegal. But if I find "those kinds" of AI-generated images on someone's phone or computer, the fact that it's AI-generated will not improve my view of that person in any possible way.

Even if it's technically "legal".

They tellin' on themselves.

[-] Ataraxia@sh.itjust.works 5 points 2 years ago

People who consume any kind of CP are dangerous, and encouraging that behavior is just as criminal. I'm glad that shit is illegal in most civilized countries.

[-] taladar@feddit.de 36 points 2 years ago

To verify if something is real, we might have to rely on witness testimony.

This is not going to work. Just because images and videos become less reliable doesn't mean we will forget that eyewitness testimony is very unreliable.

[-] Khanzarate@lemmy.world 25 points 2 years ago

You say "forget" like it's not still incredibly common as evidence.

There's lots of data showing that eyewitnesses aren't reliable, but that doesn't mean courts actually stopped relying on it. AI making another form of evidence untrustworthy will result in eyewitnesses taking its place.

[-] hansl@lemmy.world 18 points 2 years ago

A bit off topic, but I wonder if the entertainment industry as a whole is going to be completely destroyed by AI when it gets good enough.

I can totally see myself prompting “a movie about love in the style of Star Wars, with Ryan Gosling and Audrey Hepburn as the leads, directed by Alfred Hitchcock, written by Victor Hugo.” And then what? It’s game over for any content creation.

Curious if I’ll see that kind of power at home (using open source tools) in my lifetime.

[-] Benj1B@sh.itjust.works 9 points 2 years ago

I envisage a world where you're browsing Netflix and, based on past preferences, some of the title cards are generated on the fly for you. Then, based on what you click, the AI engine warms up and generates the film for you in real time. Essentially indistinguishable from the majority of Hollywood regurgitation.

And because the script is just a series of autogenerated prompts, it's like a choose-your-own-adventure book: you can steer the narrative the way you want if you elect to. Otherwise it'll be good enough to keep most monkey brains happy, and you won't even be able to tell the difference most of the time.

[-] Rootiest@lemmy.world 7 points 2 years ago

Then the real money will be in hipster retro human-generated movies

[-] Gsus4@feddit.nl 4 points 2 years ago

And it will work, because we've grown used to Hollywood being so repetitive.

[-] Zoomboingding@lemmy.world 8 points 2 years ago

I know it's impossible to perfectly predict future technology, but I believe AI will exist alongside traditional filmmaking. You'll NEVER get something with the emotional impact of Up or Schindler's List from an AI. You'll be able to make fun action or fantasy movies though, and like you said, fully customized for the viewer. I imagine it'll be like CGI vs traditional animation now - you only see the latter for passion projects, but for most uses, CGI works well enough.

[-] sv1sjp@lemmy.world 10 points 2 years ago

That's why we need Blockchain Technology..

Check Blockchain Camera for example: https://github.com/sv1sjp/Blockchain_Camera

Abstract:


Blockchain Camera provides an easy and safe way to capture and guarantee the existence of videos, reducing the impact of modified videos, as it can preserve the integrity and validity of videos using Blockchain Technology. Blockchain Camera sends the hash of each video and the time the video was recorded to the Ethereum Network, in order to be able to validate that a video is genuine and hasn't been modified, using a Blockchain Camera Validation Tool.
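The scheme the abstract describes boils down to anchoring a hash and a recording time, then recomputing the hash at verification time. A minimal Python sketch of that idea (function names are mine, and the actual Ethereum transaction is omitted - this is not the repo's code):

```python
import hashlib
import time

def fingerprint_video(video_bytes: bytes, recorded_at: float) -> dict:
    """Build the record the abstract describes: the video's hash plus
    the time it was recorded. Only this fingerprint would be published
    on-chain; the video itself never leaves the device."""
    return {
        "sha256": hashlib.sha256(video_bytes).hexdigest(),
        "recorded_at": int(recorded_at),
    }

def verify_video(video_bytes: bytes, on_chain_record: dict) -> bool:
    """A validation tool recomputes the hash and compares it with the
    anchored one; any edit to the file changes the hash."""
    return hashlib.sha256(video_bytes).hexdigest() == on_chain_record["sha256"]

original = b"...raw video frames..."
record = fingerprint_video(original, time.time())
print(verify_video(original, record))         # True: unmodified file matches
print(verify_video(original + b"x", record))  # False: tampering is detected
```

Note what this does and doesn't prove: it shows a given file existed unmodified at anchoring time, but says nothing about whether the content was genuine when it was hashed - which is exactly the objection raised further down the thread.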
[-] xigoi@lemmy.sdf.org 17 points 2 years ago

How exactly does that prevent someone from uploading a fake video?

[-] sv1sjp@lemmy.world 8 points 2 years ago

The point is to know the time a video was uploaded, as well as the previous and next videos from it, for uses such as security cameras, car accidents, etc., so that a video can be trusted. (More information can be found in the paper.)

[-] getoffthedrugsdude@lemmy.ml 5 points 2 years ago

It won't, you'll just be able to verify a source

[-] taladar@feddit.de 15 points 2 years ago

Not even that. It only allows you to verify that the source is identical to (the potentially wrong information) that was claimed at the time of recording by the person adding that information to the block chain. Blockchain, as usual, adds nothing here.

[-] fiah@discuss.tchncs.de 6 points 2 years ago

Blockchain, as usual, adds nothing here.

it can add trust. If there's a trusted central authority where these hashes can be stored then there's no need for a blockchain. However, if there isn't, then a blockchain could be used instead, as long as it's big and established enough that everybody can agree that the data stored on it cannot be manipulated

[-] nudnyekscentryk@szmer.info 12 points 2 years ago

but false, nonconsensual nudes are not collectible items that need to have their authenticity proven. They are there to destroy people's lives. Even if 99% of people seeing your nude believe you that it's not authentic, it still affects you heavily.

[-] fiah@discuss.tchncs.de 6 points 2 years ago

nonconsensual nudes are not collectible items that need to have their authenticity proven

of course not, but that's not what this comment thread is about. It's about this:

Ironically, in a sense we will revert back to the era before photography existed. To verify if something is real, we might have to rely on witness testimony.

that's where it can be very useful to store a fingerprint of a file in a trusted database, regardless of where that database gets its trust from

[-] nudnyekscentryk@szmer.info 2 points 2 years ago

sure, but again: why would anyone want to do that with consensual or nonconsensual nudes?

[-] fiah@discuss.tchncs.de 4 points 2 years ago

that is not what this comment thread is about

[-] nudnyekscentryk@szmer.info 3 points 2 years ago

it very much is:

OP: In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

parent reply: Thats why we need Blockchain Technology

[-] fiah@discuss.tchncs.de 5 points 2 years ago* (last edited 2 years ago)

a discussion can have multiple, separate threads with branching topics, that's what this threaded comment system is specifically made to facilitate

[-] nudnyekscentryk@szmer.info 3 points 2 years ago

okay, let's rethread how we got here:

OP: Spanish girls report AI porn of them circulating

parent comment: Blockchain could fix this

1-st level reply: Blockchain can't counteract fake porn being created

2-nd level reply: it lets you verify original source

3-rd level reply: if anything it lets you verify integrity between sources

you: if a central authority can't be trusted to verify sources then Blockchain can

me: it's not about verifying provenance of the material but rather its mere existence in the world

you: we can store the fingerprint of the file in a trusted database

me: but this doesn't affect the material's existence

you: you're going off-topic!

me: I am not

you: this conversation can have multiple threads

can you now see how it's you who's off the rails in this conversation? No one ever questioned that blockchain could allow verifying a piece of media's authenticity. But spreading forged, nonconsensual erotica is NOT about proving whether a photo or video in question is authentic; the problem is that people have the tools to do this in the first place, and before a victim can counteract and prove (using blockchain, if you will) that a particular photo is a forgery, the damage is done regardless.

[-] fiah@discuss.tchncs.de 3 points 2 years ago

okay, let’s rethread how we got here:
OP: Spanish girls report AI porn of them circulating
parent comment: Blockchain could fix this

you're missing a step there, buddy. I know, it's hard, let me make it a bit easier for you by drawing a picture:

"blockchain can fix this" was never about preventing AI porn from being spread, it's about the general problem of knowing whether something was authentic, hence their choice to reply to that comment with that article

[-] nudnyekscentryk@szmer.info 3 points 2 years ago

Again, for the sixth or whichever time: this has nothing to do with the crux of the problem.

[-] fiah@discuss.tchncs.de 3 points 2 years ago

yes, you're right, it doesn't, because we weren't talking about that. "blockchain" can't do anything to help kids from having AI generated naked pictures of them being spread, and nobody here claimed otherwise

load more comments (1 replies)
[-] papertowels@lemmy.one 3 points 2 years ago

....you're right, it has nothing to do with nudes, because it's talking about an entirely different problem of court-admissible evidence.

[-] devils_advocate@lemmy.ml 4 points 2 years ago

It proves that the video could not have been created at a later time.

[-] nudnyekscentryk@szmer.info 2 points 2 years ago

yeah, but the problem is the mere existence of tools allowing pornographic forgery, not verifying whether the material is real or not

[-] Gsus4@feddit.nl 6 points 2 years ago* (last edited 2 years ago)

How is that better than an immutable database where you guarantee trust simply by getting your own public hash receipt for the database every time you introduce a new item? Why obfuscate things by riding the "Blockchain" hype bandwagon?

[-] papertowels@lemmy.one 2 points 2 years ago

Who manages and guarantees that immutable database?

[-] hardware26@discuss.tchncs.de 6 points 2 years ago

Not necessarily; solutions can be implemented. For example, footage from private security cameras can be sent in real time to a trusted establishment (trusted by the court, at least), where it can be timestamped and stored (maybe not necessarily even stored there; encryption with a timestamp may be enough). If the source camera and the network are secure, the footage is also secure.
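The "timestamp without storing" variant amounts to trusted timestamping: the establishment signs (hash, time) tokens and keeps only a key. A minimal Python sketch under that assumption (all names are mine; a real deployment would use public-key signatures per RFC 3161 rather than a shared HMAC key):

```python
import hashlib
import hmac
import json

# Placeholder secret held only by the trusted establishment.
TRUSTED_KEY = b"held-only-by-the-trusted-establishment"

def timestamp_footage(frame_bytes: bytes, now: float) -> dict:
    """The establishment receives footage (or just its hash) in real
    time and returns a signed timestamp token. The token alone later
    proves what existed when - no need to store the footage itself."""
    payload = {"sha256": hashlib.sha256(frame_bytes).hexdigest(),
               "timestamp": int(now)}
    msg = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(TRUSTED_KEY, msg, hashlib.sha256).hexdigest()
    return payload

def court_verifies(frame_bytes: bytes, token: dict) -> bool:
    """The court checks that the token was really issued by the trusted
    establishment AND that the presented footage matches its hash."""
    claimed = {"sha256": token["sha256"], "timestamp": token["timestamp"]}
    msg = json.dumps(claimed, sort_keys=True).encode()
    sig_ok = hmac.compare_digest(
        token["signature"],
        hmac.new(TRUSTED_KEY, msg, hashlib.sha256).hexdigest())
    return sig_ok and hashlib.sha256(frame_bytes).hexdigest() == token["sha256"]

footage = b"camera frames ..."
token = timestamp_footage(footage, 1695081600.0)
print(court_verifies(footage, token))         # True: footage and token agree
print(court_verifies(footage + b"!", token))  # False: edited footage fails
```

As the reply below points out, this only shifts the trust problem upstream: the token proves nothing if the feed was tampered with before it reached the establishment.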

[-] Benj1B@sh.itjust.works 3 points 2 years ago

Network security is a pretty big ask though - just look at how many unsecured cameras are around now. And once an attacker is in, anything generated on that network becomes suspect - how do you know the security camera feed wasn't intercepted, manipulated, or replaced altogether?

[-] lambalicious@lemmy.sdf.org 2 points 2 years ago

To verify if something is real, we might have to rely on ~~witness testimony~~ flagrancy.

FTFY. Witness testimony has never been that good a means to verify something is real.

[-] Seudo@lemmy.world 18 points 2 years ago

Same goes for any deepfake. People are losing their shit because "we won't know what's real and what's not!"

We should have been teaching critical thinking a generation ago. Sagan was pleading for reform in the 90s. We can start teaching the next generation how to navigate the Information Age. What we can't do is make the world childproof.

[-] Cethin@lemmy.zip 10 points 2 years ago

Yeah, what I see happening is people end up not caring as much because there's going to be so much plausible AI generated crap that any real stuff will be lost in the noise.

[-] Turun@feddit.de 4 points 2 years ago

Source for the law you mentioned, please. I want to read it in detail.

[-] aard@kyu.de 8 points 2 years ago

Start with this relatively recent case, and from there you should have enough info to search for yourself what has happened over the last few years - it is exactly what was warned about back then. But anyone who brings well-reasoned arguments to the hysterical lunatics who want to hit anything remotely connected to "teenagers discovering sexuality" with criminal law is immediately branded a paedophile themselves.

https://www.swr.de/swraktuell/rheinland-pfalz/koblenz/lehrerin-kinderpornografischer-inhalte-konfisziert-deswegen-angeklagt-100.html

this post was submitted on 19 Sep 2023
650 points (100.0% liked)
