
A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because their service plainly and openly shows one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.

top 50 comments
[-] guyrocket@kbin.social 127 points 8 months ago

This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why "AI" being involved matters is beyond me. The result is the same: fake porn/nudes.

And all the hand wringing in the world about it being non consensual will not stop it. The cat has been out of the bag for a long time.

I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.

[-] kent_eh@lemmy.ca 111 points 8 months ago

People have been Photoshopping this kind of thing since before there was Photoshop. Why "AI" being involved matters is beyond me

Because now it's faster, can be generated in bulk and requires no skill from the person doing it.

[-] ArmokGoB@lemmy.dbzer0.com 42 points 8 months ago

I blame electricity. Before computers, people had to learn to paint to do this. We should go back to living like medieval peasants.

[-] Bob_Robertson_IX@discuss.tchncs.de 32 points 8 months ago

A kid at my high school in the early 90s would use a photocopier and would literally cut and paste yearbook headshots onto porn photos. This could also be done in bulk and doesn't require any skills that a 1st grader doesn't have.

[-] ChexMax@lemmy.world 29 points 8 months ago

Those are easily disproven. There's no way you think that's the same thing. If you can pull up the source photo and it's a clear match/copy for the fake, it's easy to disprove. AI can alter the angle, position, and expression on your face in a believable manner, making it a lot harder to link the photo to the source material.

[-] nudnyekscentryk@szmer.info 18 points 8 months ago

But now they are photo realistic

[-] echo64@lemmy.world 87 points 8 months ago

I hate this: "Just accept it women of the world, accept the abuse because it's the new normal" techbro logic so much. It's absolutely hateful towards women.

We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.

[-] AquaTofana@lemmy.world 55 points 8 months ago

I don't know why you're being downvoted. Sure, it's unfortunately been happening for a while, but we're just supposed to keep quiet about it and let it go?

I'm sorry, putting my face on a naked body that's not mine is one thing, but I really do fear for the people whose likeness gets used in some degrading/depraved porn and it's actually believable because it's AI generated. That is SO much worse/psychologically damaging if they find out about it.

[-] brbposting@sh.itjust.works 17 points 8 months ago

It’s unacceptable.

We have legal and justice systems to deal with this.

For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):

Ill adults and poor kids generate and sell CSAM. Common to advertise on IG, sell on TG. Huge problem as that Stanford report shows.

Telegram got right on it (not). Fuckers.

[-] EatATaco@lemm.ee 22 points 7 months ago

I suck at Photoshop and I've tried many times to get good at it over the years. I was able to train a local Stable Diffusion model on my and my family's faces and create numerous images of us in all kinds of situations in two nights of work. You can get a snap of someone and have nudes of them tomorrow for super cheap.

I agree there is nothing to be done, but it's painfully obvious to me that it's the scale and ease of it that makes it much more concerning.

[-] AstralPath@lemmy.ca 21 points 8 months ago

This kind of attitude toward non-consensual actions is what perpetuates them. Fuck that shit.

[-] Assman@sh.itjust.works 20 points 8 months ago

The same reason AR15 rifles are different than muskets

[-] RobotToaster@mander.xyz 119 points 8 months ago

This is only going to get easier. The djinn is out of the bottle.

[-] goldteeth@lemmy.dbzer0.com 79 points 8 months ago* (last edited 8 months ago)

"Djinn", specifically, being the correct word choice. We're way past fun-loving blue cartoon Robin Williams genies granting wishes, doing impressions of Jack Nicholson and getting into madcap hijinks. We're back into fuckin'... shapeshifting cobras woven of fire and dust by the archdevil Iblis, hiding in caves and slithering out into the desert at night to tempt mortal men to sin. That mythologically-accurate shit.

[-] conciselyverbose@sh.itjust.works 43 points 8 months ago

Doesn't mean distribution should be legal.

People are going to do what they're going to do, and the existence of this isn't an argument to put spyware on everyone's computer to catch it or whatever crazy extreme you can take it to.

But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.

[-] treadful@lemmy.zip 16 points 7 months ago

Almost always it makes more sense to ban the action, not the tool. Especially for tools with such generalized use cases.

[-] GrymEdm@lemmy.world 74 points 7 months ago* (last edited 7 months ago)

To people who aren't sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There's also extreme risk of depression, anger, anxiety, etc. The analogy given is that it's like watching a video the next day of yourself undergoing sex without consent, as if you'd been drugged.

I'll admit I used to look at celeb deepfakes, but once I saw that video I stopped immediately and avoid it as much as I possibly can. I believe porn can be done correctly with participant protection and respect. Regarding deepfakes/revenge porn, though, that statistic about suicidal ideation puts it outside of anything healthy or ethical. Obviously I can't make that decision for others or purge the internet, but the fact that there's such regular and extreme harm to the (what I now know are) victims of non-consensual porn makes it personally immoral. Not because of religion or society, but because I want my entertainment to be at minimum consensual and hopefully fun and exciting, not killing people or ruining their happiness.

I get that people say this is the new normal, but it's already resulted in trauma and will almost certainly continue to do so. Maybe even get worse as the deepfakes get more realistic.

[-] lud@lemm.ee 20 points 7 months ago

once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves.

Not saying that they are justified or anything but wouldn't people stop caring about them when they reach a critical mass? I mean if everyone could make fakes like these, I think people would care less since they can just dismiss them as fakes.

[-] eatthecake@lemmy.world 24 points 7 months ago

The analogy given is that it's like watching a video the next day of yourself undergoing sex without consent, as if you'd been drugged.

You want a world where people just desensitise themselves to things that make them want to die through repeated exposure. I think you'll get a whole lot of complex PTSD instead.

[-] stephen01king@lemmy.zip 22 points 7 months ago

People used to think their lives were over if they were caught alone with someone of the opposite sex they weren't married to. That is no longer the case in Western countries due to normalisation.

The thing that makes them want to die is societal pressure, not the act itself. In this case, if societal pressure from having fake nudes of yourself spread is removed, most of the harm done to people should be neutralised.

[-] JackGreenEarth@lemm.ee 67 points 8 months ago

That's a ripoff. It costs them at most $0.10 to do simple Stable Diffusion img2img. And most people could do it themselves; they're purposefully exploiting people who aren't tech-savvy.

[-] Khrux@ttrpg.network 63 points 8 months ago* (last edited 8 months ago)

I have no sympathy for the people who are being scammed here; I hope they lose hundreds to it. Making fake porn of somebody else without their consent, particularly porn that could be mistaken for real if it were seen by others, is awful.

I wish everyone involved in this use of AI a very awful day.

[-] sentient_loom@sh.itjust.works 15 points 8 months ago

Imagine hiring a hit man and then realizing he hired another hit man at half the price. I think the government should compensate them.

[-] echo64@lemmy.world 49 points 8 months ago

The people being exploited are the ones who are the victims of this, not people who paid for it.

[-] istanbullu@lemmy.ml 43 points 8 months ago

it's a "I don't know tech" tax

[-] oce@jlai.lu 29 points 8 months ago

That's like 80% of the IT industry.

[-] sugar_in_your_tea@sh.itjust.works 27 points 8 months ago

IDK, $10 seems pretty reasonable to run a script for someone who doesn't want to. A lot of people have that type of arrangement for a job...

That said, I would absolutely never do this for someone, I'm not making nudes of a real person.

[-] IsThisAnAI@lemmy.world 15 points 8 months ago* (last edited 8 months ago)

A scam is another thing. Fuck these people selling.

But fuck, dude, they aren't taking advantage of anyone buying the service. That's not how the fucking world works. It turns out that even if you have money, you can pay people to do shit like clean your house or do an oil change.

NOBODY on that side of the equation is being exploited 🤣

[-] flower3@feddit.de 59 points 8 months ago

I doubt tbh that this is the most severe harm of generative AI tools lol

[-] Sanctus@lemmy.world 24 points 8 months ago

Pretty sure we will see fake political candidates that actually garner votes soon here.

[-] SnotFlickerman 15 points 8 months ago

The Waldo Moment manifest.

[-] Luisp@lemmy.dbzer0.com 20 points 8 months ago

The Israeli facial recognition program, for example.

[-] General_Effort@lemmy.world 45 points 8 months ago* (last edited 8 months ago)

Porn of Normal People

Why did they feel the need to add that "normal" to the headline?

[-] sentient_loom@sh.itjust.works 59 points 8 months ago

To differentiate from celebrities.

[-] echo64@lemmy.world 37 points 8 months ago

Every time this comes up, all the tech nerds here like to excuse it as fine and not a bad thing at all. I am hoping this won't happen this time, but knowing Lemmy's audience...

[-] sbv@sh.itjust.works 19 points 8 months ago

The Lemmy circlejerk is real, but excusing deep fake porn is pretty off brand for us. I'm glad the comments on this post are uniformly negative.

[-] SendMePhotos@lemmy.world 29 points 7 months ago

I'd like to share my initial opinion here. "Non-consensual AI-generated nudes" is technically a freedom, no? Like, we can bastardize our presidents, paste people's photos on devils or other characters, so why are AI nudes where the line is drawn? The internet made photos of Trump and Putin kissing shirtless.

[-] LadyAutumn 23 points 7 months ago* (last edited 7 months ago)

They're making pornography of women who are not consenting to it, which is an extremely invasive thing to do and has massive social consequences for women and girls. This could (and almost certainly will) be used on kids too, right? This can literally be a tool for the production of child pornography.

Even with regards to adults, do you think this will be used exclusively on public figures? Do you think people aren't taking pictures of their classmates, of their co-workers, of women and girls they personally know and having this done to pictures of them? It's fucking disgusting, and horrifying. Have you ever heard of the correlation between revenge porn and suicide? People literally end their lives when pornographic material of them is made and spread without their knowledge and consent. It's terrifyingly invasive and exploitative. It absolutely can and must be illegal to do this.

[-] cley_faye@lemmy.world 16 points 7 months ago

It absolutely can and must be illegal to do this.

Given that it can be done in a private context and there is absolutely no way to enforce it without looking into random people's computers unless they post it online publicly, you're just asking for a new law to reassure people, with no effect. That's useless.

[-] Maggoty@lemmy.world 21 points 7 months ago

It's a far cry from making weird memes to making actual porn. Especially when it's not easily seen as fake.

[-] antlion@lemmy.dbzer0.com 15 points 7 months ago

Seems to fall under any other form of legal public humiliation to me, UNLESS it is purported to be true or genuine. I think if there's a clear AI watermark or artist's signature, that's free speech. If not, it falls under libel: false and defamatory statements published as fact. Any harmful deepfake released as truth should be prosecuted as libel or slander, whether it's sexual or not.

[-] anticurrent@sh.itjust.works 17 points 7 months ago

We are acting as if throughout history we managed to orient technology so as to only keep the benefits and eliminate the negative effects, while in reality most of the technology we use still comes with both aspects. And it is not gonna be different with AI.
