196 points · submitted 8 months ago (16 Mar 2024) by MicroWave@lemmy.world to c/news@lemmy.world

Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.

"I'd already heard about deepfakes and deepnudes (...) but I wasn't really aware of it until it happened to me. It was a slightly anecdotal event that happened in other people's lives, but it wouldn't happen in mine", thought Julia, a 21-year-old Belgian marketing student and semi-professional model.

At the end of September 2023, she received an email from an anonymous sender. Subject: "Realistic?" "We wonder which photo would best resemble you," she reads.

Attached were five photos of her.

In the original photos, posted on her social networks, Julia poses fully clothed. Before her eyes are the same photos. Only this time, Julia is completely naked.

Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.

all 43 comments
[-] nexusband@lemmy.world 44 points 8 months ago

This is going to be a serious issue in the future - either society changes and these things become accepted, or these kinds of generative AI models have to be banned. But even that still isn't going to be "security" against it...

I also think we have to come up with digital watermarks that are easy to use...
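(Purely as an illustration of the idea, not something from the article or this thread: a provenance watermark could be as simple as hiding a short tag in an image's least-significant bits. Real schemes like C2PA content credentials or model-level watermarks are far more robust, and an LSB tag like the sketch below is trivially stripped, which is exactly why a watermark alone is still no "security".)

```python
# Illustrative sketch only: hide and read back a short provenance tag
# ("watermark") in the least-significant bit of an image's red channel.
# Uses Pillow; a real, tamper-resistant watermark needs far more than this.
from PIL import Image

TAG = "AI-GENERATED"  # hypothetical provenance marker

def embed_tag(src: str, dst: str, tag: str = TAG) -> None:
    img = Image.open(src).convert("RGB")
    px = img.load()
    # Tag bytes as a bit string, NUL-terminated so the reader knows where to stop.
    bits = "".join(f"{b:08b}" for b in tag.encode("ascii")) + "0" * 8
    w, h = img.size
    if len(bits) > w * h:
        raise ValueError("image too small for tag")
    for i, bit in enumerate(bits):
        x, y = i % w, i // w
        r, g, b = px[x, y]
        px[x, y] = ((r & ~1) | int(bit), g, b)  # overwrite lowest red bit
    img.save(dst, "PNG")  # lossless format, so the bits survive saving

def read_tag(path: str) -> str:
    img = Image.open(path).convert("RGB")
    px = img.load()
    w, h = img.size
    out, byte = bytearray(), 0
    for i in range(w * h):
        x, y = i % w, i // w
        byte = (byte << 1) | (px[x, y][0] & 1)
        if i % 8 == 7:
            if byte == 0:  # hit the NUL terminator
                break
            out.append(byte)
            byte = 0
    return out.decode("ascii", errors="replace")

# Example usage:
# embed_tag("original.png", "tagged.png")
# print(read_tag("tagged.png"))  # -> "AI-GENERATED"
```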

[-] jlh@lemmy.jlh.name 29 points 8 months ago

Honestly, I see it as kinda freeing. Now people don't have to worry about nudes leaking any more, since you can just say they're fake. Somebody starts sending around deepfakes of me? OK, whatever, weirdo, it's not real.

[-] DadVolante@sh.itjust.works 34 points 8 months ago

I'm guessing it's easier to feel that way if your name is Justin.

If it was Justine, you might have issues.

Weird how that works.

[-] jlh@lemmy.jlh.name 19 points 8 months ago

Fair enough. Ideally it would be the same for women too, but we're not there as a society yet.

[-] steakmeoutt@sh.itjust.works 12 points 8 months ago* (last edited 8 months ago)

Such an empty response. Do you know that women have to take precautions on dates out of fear of being killed? They literally have a rational fear of being killed by their male dates, and it's a commonly known and accepted fear that many women share.

Society moving forward is a nice idea; women feeling safe is a much better one, and attitudes like yours are part of the reason women generally do not feel safe. Deepfakes are not freeing at all.

[-] zzx@lemmy.world 9 points 8 months ago
[-] fidodo@lemmy.world 20 points 8 months ago

I think there's a big difference between creating them and spreading them, and putting punishments on spreading nudes against someone's will, real or fake, is a better third option. The free speech implications of banning software that's capable of creating them are too broad and fuzzy, but I think that putting harsh penalties on spreading them on the grounds of harassment would be clear cut and effective. I don't see a big difference between spreading revenge porn and deepfakes, and we already have laws against spreading revenge porn.

[-] tsonfeir@lemm.ee 20 points 8 months ago* (last edited 8 months ago)

We gotta ban photo editing software too. Shit, we gotta ban computers entirely. Shit, now we have to ban electricity.

[-] Ryzen11v@ani.social 23 points 8 months ago* (last edited 8 months ago)

I'm so tired of this "Don't blame the tool" bs argument used to divert responsibility.

Blame the fucking tool and restrict it.

[-] fidodo@lemmy.world 19 points 8 months ago

Why not blame the spread? You can't ban the tool: it's easily accessible software that only requires easily accessible consumer hardware, and you can even semi-easily train your own models using porn that's easily accessible on the Internet. So if you wanted to ban it outright, you'd need to ban the general purpose tool, all porn, and the knowledge to train image generation models. If you mean banning the online apps that sell the service in the cloud, I can get behind that; it would raise the bar to create them a little, but it's far from a solution.

But we already have laws against revenge porn and Internet harassment. I think the better and more feasible approach, one that doesn't have far-reaching free speech implications, would be to simply put heavy penalties on spreading nude images of people against their will, whether those images are real or fake. It's harassment just like revenge porn, and I don't see how it's different if it's a realistic fake. If there is major punishment for spreading these images, I think that will take care of discouraging the spread for the vast majority of people.

[-] tsonfeir@lemm.ee 8 points 8 months ago

Social media is a tool used to spread misinformation. Should social media be banned?

[-] Ryzen11v@ani.social 19 points 8 months ago
[-] tsonfeir@lemm.ee 12 points 8 months ago

So delete your account.

[-] nexusband@lemmy.world 7 points 8 months ago

Social media as a business model? Yes, absolutely.

[-] tsonfeir@lemm.ee 9 points 8 months ago

Is open source AI image generation a business model?

[-] fidodo@lemmy.world 6 points 8 months ago

The companies that host and sell an online image-to-nude service, using a tuned version of that tool specifically designed to convert images into nudes, are definitely a business model.

I agree it's impractical, and opens dangerous free speech problems, to try to ban or regulate the general purpose software. But I don't have a problem with regulating for-profit online image generation services that have been advertising the ability to turn images into nudes, and have even been advertising their service on non-porn sites. Regulating those will at least raise the bar a bit and ensure there isn't a for-profit motive where capitalism encourages it happening even more.

We already have revenge porn laws that outlaw the spread of real nudes against someone's will; I don't see why the spread of fakes shouldn't be outlawed similarly.

[-] tsonfeir@lemm.ee 2 points 8 months ago

And I think if those companies can be identified as making the offending image, they should be held liable. IMO, you shouldn't be able to use a photo without the permission of the person in it.

[-] littlebluespark@lemmy.world 6 points 8 months ago

"Blame the fucking tool and restrict it."

I mean. It's worked so well with you so far, why not?

[-] CommanderCloon@lemmy.ml 2 points 8 months ago
[-] tsonfeir@lemm.ee 3 points 8 months ago

Pencils are the tool of Satan!!!

[-] Lemming6969@lemmy.world 5 points 8 months ago

With AI and digital art... what is real? What is a person? What is a cartoon, or a similar-but-not-identical likeness? In some cases, what even is nudity? How old is an AI image? How can anything then be legal or illegal?

[-] sir_pronoun@lemmy.world 28 points 8 months ago

I seriously don't get why society cares if there are photos of anyone's private parts.

[-] VeganCheesecake 38 points 8 months ago

I think we as a society are too uptight about nudity, but that doesn't mean that creating pictures of people without their consent, which make them feel uncomfortable, is in any way OK.

[-] tsonfeir@lemm.ee 26 points 8 months ago

I imagine those people are humiliated.

[-] damnthefilibuster@lemmy.world 34 points 8 months ago

They are humiliated only because society has fed them the idea that what they've done (in this case, not something they did, but something done to them) is wrong. Internalizing shame meted out by society is the real psychological problem we need to fix.

[-] tsonfeir@lemm.ee 13 points 8 months ago

Society does indeed play a big role, but if someone went around telling lies about you that everyone believed regardless of how much you denied it, that would take a toll on you.

[-] sir_pronoun@lemmy.world 10 points 8 months ago

That's what I meant. Why should it be shameful? If it weren't, those photos would lose so much of their harm.

[-] eatthecake@lemmy.world 8 points 8 months ago

Who are you to tell people how they ought to feel? The desire for privacy is perfectly normal, and you are the one trying to shame people for not wanting naked pictures of themselves everywhere.

[-] damnthefilibuster@lemmy.world 5 points 8 months ago

That’s fair.

[-] natecheese@kbin.melroy.org 17 points 8 months ago

I think the issue is that sexual imagery of the person is being created and shared without that person's consent.

It's akin to taking nude photos of someone without their consent, or sharing nude photos with someone other than their intended audience.

Even if there were no stigma attached to nudes, that doesn't mean someone would want their nudes to exist or be shared.

[-] prettybunnys@sh.itjust.works 12 points 8 months ago

I’ll continue this conversation in good faith only after you’ve shown us yours to prove your position.

[-] papertowels@programming.dev 7 points 8 months ago

Right??? I send strangers pictures of mine all the time!!

[-] werefreeatlast@lemmy.world 8 points 8 months ago

Someone at TikTok has all the power to make nudes of everyone on the planet, except for 5 homeless guys in LA you don't want a nude of anyway. TikTok has the images of you (you idiot) and the hardware and software required to fake you to everyone you know.

Welcome to China 2.0!

[-] VeganCheesecake 9 points 8 months ago

No, they don't. Neither does Instagram; to my knowledge they have two photos of me, posted by other people. Now Grindr, on the other hand...

[-] werefreeatlast@lemmy.world 6 points 8 months ago

Oh I would expect Grindr to call this a feature....

You liked Jeff, but Jer, for an extra $7.53 you can automatically see him naked to reveal the full package!

[-] VeganCheesecake 5 points 8 months ago

The most unrealistic thing about this is the price. They'd want a twenty, minimum, I feel.

[-] graymess@lemmy.world 2 points 8 months ago

Pal, what the fuck are you talking about? TikTok and China are not mentioned anywhere in this article and nowhere on TikTok is there an option to generate anyone's likeness, clothed or unclothed.
