Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden on the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes were still live, and some of the ad buyers were still active, until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

[-] Max_P@lemmy.max-p.me 165 points 8 months ago

Seen similar stuff on TikTok.

That's the big problem with ad marketplaces and automation: the ads are rarely vetted by a human. You can just give them money and upload your ad, and they'll happily display it. They rely entirely on users to report them, which most people don't do because they're ads, and they won't take one down unless it's really bad.

[-] Evilcoleslaw@lemmy.world 55 points 8 months ago* (last edited 8 months ago)

It's especially bad on reels/shorts for pretty much all platforms. Tons of financial scams looking to steal personal info or worse. And I had one on a Facebook reel, for boner pills, that was legit a minute-long ad of hardcore porn. Not just nudity but straight-up uncensored fucking.

[-] Nobody@lemmy.world 130 points 8 months ago

It’s all so incredibly gross. Using “AI” to undress someone you know is extremely fucked up. Please don’t do that.

[-] TheBat@lemmy.world 51 points 8 months ago

I'm going to undress Nobody. And give them sexy tentacles.

[-] MonkderDritte@feddit.de 18 points 8 months ago* (last edited 8 months ago)

Same vein as "you shouldn't mentally undress the girl you fancy". It's just a tool that supports that. Not that I have used it.

Don't just upload someone else's image without consent, though. That's even illegal in most of Europe.

[-] uriel238 90 points 8 months ago

It remains fascinating to me how these apps are being responded to in society. I'd assume part of the point of seeing someone naked is to know what their bits look like, while these just extrapolate with averages (and likely, averages of glamor models). So we still don't know what these people actually look like naked.

And yet, people are still scorned and offended as if they were.

Technology is breaking our society, albeit in places where our culture was vulnerable to being broken.

[-] Beebabe@lemmy.world 36 points 8 months ago

Something like this could be career ending for me. Because of the way people react. “Oh did you see Mrs. Bee on the internet?” Would have to change my name and move three towns over or something. That’s not even considering the emotional damage of having people download you. Knowledge that “you” are dehumanized in this way. It almost takes the concept of consent and throws it completely out the window. We all know people have lewd thoughts from time to time, but I think having a metric on that…it would be so twisted for the self-image of the victim. A marketplace for intrusive thoughts where anyone can be commodified. Not even celebrities, just average individuals trying to mind their own business.

[-] captainlezbian@lemmy.world 14 points 8 months ago

Exactly. I’m not even shy, my boobs have been out plenty and I’ve sent nudes all that. Hell I met my wife with my tits out. But there’s a wild difference between pictures I made and released of my own will in certain contexts and situations vs pictures attempting to approximate my naked body generated without my knowledge or permission because someone had a whim.

[-] inb4_FoundTheVegan@lemmy.world 30 points 8 months ago* (last edited 8 months ago)

Wtf are you even talking about? People should have the right to control if they are "approximated" as nude. You can wax poetic about how it's not necessarily accurate, but that's because you are ignoring the woman who did not consent to the process. Like, if I posted a nude then that's on the internet forever. But now, any picture at all can be made nude and posted to the internet forever. You're entirely removing consent from the equation, you ass.

[-] echodot@feddit.uk 17 points 8 months ago

I suspect it's more affecting for younger people who don't really think about the fact that in reality, no one has seen them naked. Probably traumatizing for them and logic doesn't really apply in this situation.

[-] StitchIsABitch@lemmy.world 29 points 8 months ago

Does it really matter though? "Well you see, they didn't actually see you naked, it was just a photorealistic approximation of what you would look like naked".

At that point I feel like the lines get very blurry, it's still going to be embarrassing as hell, and them not being "real" nudes is not a big comfort when having to confront the fact that there are people masturbating to your "fake" nudes without your consent.

I think in a few years this won't really be a problem because by then these things will be so widespread that no one will care, but right now the people being specifically targeted by this must not be feeling great.

[-] BreakDecks@lemmy.ml 17 points 8 months ago

The draw to these apps is that the user can exploit anyone they want. It's not really about sex, it's about power.

[-] uriel238 14 points 8 months ago

Human society is about power. It is because we can't get past dominance hierarchy that our communities do nothing about schoolyard bullies, or workplace sexual harassment. It is why abstinence-only sex-ed has nothing positive to say to victims of sexual assault, once they make it clear that used goods are used goods.

Our culture agrees by consensus that seeing a woman naked, whether in a candid shot, caught in flagrante delicto, or rendered from whole cloth by a generative AI system, redefines her as a sexual object, reducing her qualifications as a worker, official or future partner. That's a lot of power to give to some guy with X-ray Specs, and it speaks poorly of how society regards women, or human beings in general.

We disregard sex workers, too.

Violence sucks, but without the social consensus that propagates sexual victimhood, it would just be violence. Sexual violence is extra awful because the rest of society actively participates in making it extra awful.

[-] ClusterBomb 79 points 8 months ago

Interesting how we can "undress any girl" but I have not seen a tool to "undress any boy" yet. 😐

I don't know what it says about people developing those tools. (I know, in fact)

[-] r00ty@kbin.life 60 points 8 months ago

Make one :P

Then I suspect you'll find the answer is money. The ones for women simply make more money.

[-] Flipper@feddit.de 30 points 8 months ago

I've seen a tool like that. Everyone was a bodybuilder and hung like a horse.

[-] Peppycito@sh.itjust.works 34 points 8 months ago

I'm going to guess all the ones of women have bolt on tiddies and no pubic hair.

[-] starman@programming.dev 21 points 8 months ago* (last edited 8 months ago)

Be the change you wish to see in the world

\s

[-] echodot@feddit.uk 16 points 8 months ago

You probably don't need them. You can get these photos without even trying. It's a bit of a problem, really.

[-] LadyAutumn 59 points 8 months ago

Lot of people in this thread who don't seem to understand what sexual exploitation is. I've argued about this exact subject on threads like this before.

It is absolutely horrifying that someone you know could take your likeness and render it into a form for their own sexual gratification. It doesn't matter that it's ai rendered. The base image is still you, the face in the image is still your face, and you are still the object being sexualized. I can't describe how disgusting that is. If you do not see the problem in that I don't know what to tell you. This will be used on images of normal non-famous women. It will be used on pictures from the social media profiles of teenage girls. These ads were on a platform with millions of personal accounts of women and girls. It's sickening. There is no consent involved here. It's non-consensual pornography.

[-] callouscomic@lemm.ee 45 points 8 months ago* (last edited 8 months ago)

So many of these comments are breaking down into arguments of basic consent for pics, and knowing how so many people are, I sure wonder how many of those same people post pics of their kids on social media constantly and don't see the inconsistency.

[-] Grandwolf319@sh.itjust.works 23 points 8 months ago

There aren’t really many good reasons to post your kid’s picture anyway.

[-] Tylerdurdon@lemmy.world 40 points 8 months ago

AI gives creative license to anyone who can communicate their desires well enough. Every great advancement in the media age has been pushed in one way or another with porn, so why would this be different?

I think if a person wants visual "material," so be it. They're doing it with their imagination anyway.

Now, generating fake media of someone for profit or malice, that should get punishment. There's going to be a lot of news cycles with some creative perversion and horrible outcomes intertwined.

I'm just hoping I can communicate the danger of some of the social media platforms to my children well enough. That's where the most damage is done with that kind of stuff.

[-] abhibeckert@lemmy.world 38 points 8 months ago* (last edited 8 months ago)

The porn industry is, in fact, extremely hostile to AI image generation. How can anyone make money off porn if users simply create their own?

Also, I wouldn't be surprised if it's false advertising, and clicking the ad will in fact just take you to a webpage with more ads, then a link from there to more ads, and so on, until users eventually give up (after, the scammers hope, clicking on an ad along the way).

Whatever's going on, the ad is clearly a violation of Instagram's advertising terms.

I’m just hoping I can communicate the danger of some of the social media platforms to my children well enough. That’s where the most damage is done with that kind of stuff.

It's not just your children you need to communicate it to. It's all the other children they interact with. For example, I know a young girl (not even a teenager yet) who has been bullied on social media lately; the fact that she doesn't use social media herself doesn't stop other people from saying nasty things about her in public (and who knows, maybe they're even sharing AI-generated CSAM based on photos they've taken of her at school).

[-] evlogii@lemm.ee 38 points 8 months ago* (last edited 8 months ago)

Isn't it kinda funny that the "most harmful applications of AI tools are not hidden on the dark corners of the internet," yet this article is locked behind a paywall?

[-] EdibleFriend@lemmy.world 38 points 8 months ago

YouTube has been running these for like 6 or 7 months, even with famous people in the ads. I remember one for a while with Ortega.

[-] CrayonRosary@lemmy.world 14 points 8 months ago

Ortega? The taco sauce?

NSFW-ish

[-] _sideffect@lemmy.world 37 points 8 months ago

Good, let all celebs come together and sue zuck into the ground

[-] BreakDecks@lemmy.ml 25 points 8 months ago

Sharing this screenshot again, to drive the point home.

[-] LucidBoi@lemmy.dbzer0.com 21 points 8 months ago

What in the fuck are all these photos of kids? They're not part of the ad?

[-] BreakDecks@lemmy.ml 44 points 8 months ago

This was from a test I did with a throwaway account on IG where I followed a handful of weirdo parents who run "model" accounts for their kids to see if Instagram would start pushing problematic content as a result (spoiler: yes they will).

It took about 5 minutes from creating the account to end up with nothing but dressed down kids on my recommendations page paired with inappropriate ads. I guess the people who follow kids on IG also like these recommended photos, and the algorithm also figures they must be perverts, but doesn't care about the sickening juxtaposition of children in swimsuits next to AI nudifying apps.

Don't use Meta products. They don't care about ethics, just profits.

[-] Karyoplasma@discuss.tchncs.de 22 points 8 months ago* (last edited 8 months ago)

The other day, I had an ad on facebook that was basically lolicon. It depicted a clearly underage anime girl in a sexually suggestive position on a motorcycle with their panties almost off. I am in Germany, Facebook knows I am in Germany and if I took a screenshot of that ad and saved it, it would probably be classed as CSAM in my jurisdiction. I reported the ad and got informed that FB found "nothing wrong" with it a few days later. Fuck off, you child predators.

[-] Kedly@lemm.ee 24 points 8 months ago

ITT: A bunch of creepy fuckers who don't think society should judge them for being fucking creepy

[-] MareOfNights@discuss.tchncs.de 23 points 8 months ago

Am I the only one who doesn't care about this?

Photoshop has existed for some time now, so creating fake nudes just became easier.

Also why would you care if someone jerks off to a photo you uploaded, regardless of potential nude edits. They can also just imagine you naked.

If you don't want people to jerk off to your photos, don't upload any. It happens with and without these apps.

But Instagram selling apps for it is kinda fucked: the platform is very anti-porn, yet it sells these apps (to children).

[-] YungOnions@sh.itjust.works 42 points 8 months ago* (last edited 8 months ago)

It's about consent. If you have no problem with people jerking off to your pictures, fine, but others do.

If you don't want people to jerk off to your photos, don't upload any. It happens with and without these apps.

You get that that opinion is pretty much the same as saying "if she didn't want to be harassed, she shouldn't have worn such provocative clothing"!?

How about we allow people to upload whatever pictures they want and try to address the weirdos turning them into porn without consent, rather than blaming the victims?

[-] Cris_Color@lemmy.world 28 points 8 months ago

I think it's clear you have never experienced being sexualized when you weren't okay with it. It's a pretty upsetting experience that can feel pretty violating. And as most guys rarely if ever experience being sexualized, never mind when they don't want to be, I'm not surprised people might be unable to empathize.

Having experienced being sexualized when I wasn't comfortable with it, this kind of thing makes me kinda sick to be honest. People are used to having a reasonable expectation that posting safe for work pictures online isn't inviting being sexualized. And that it would almost never be turned into pornographic material featuring their likeness, whether it was previously possible with Photoshop or not.

It's not surprising people would find the loss of that reasonable assumption discomforting given how uncomfortable it is to be sexualized when you don't want to be. How uncomfortable a thought it is that you can just be going about your life and minding your own business, and it will now be convenient and easy to produce realistic porn featuring your likeness, at will, with no need for uncommon skills not everyone has

[-] InternetPerson@lemmings.world 15 points 8 months ago

Also why would you care if someone jerks off to a photo you uploaded, regardless of potential nude edits. They can also just imagine you naked.

Imagining and creating physical (even digital) material are different levels of how real and tangible it feels. Don't you think?

There is an active act of carefully editing those pictures involved. It's a misuse, and against your intention when you posted such a picture of yourself. You are losing control over it and unwillingly become part of someone else's sexual act.

Sure, those, who feel violated by that, might also not like if people imagine things, but that's still a less "real" level.

For example: Imagining to murder someone is one thing. Creating very explicit pictures about it and watching them regularly, or even printing them and hanging them on the walls of one's room, is another.
I don't want to equate murder fantasies with sexual ones. My point is to illustrate that it feels to me and obviously a lot of other people that there are significant differences between pure imagination and creating something tangible out of it.

[-] MonkderDritte@feddit.de 21 points 8 months ago

That bidding model for ads should be illegal. Alternatively, companies displaying them should be held responsible and be able to tell where an ad came from. Misinformation has become a real problem, especially in politics.

[-] Sami_Uso@lemmy.world 20 points 8 months ago

Capitalism works! It breeds innovation like this! good luck getting non consensual ai porn in your socialist government

[-] Wispy2891@lemmy.world 18 points 8 months ago

Something that can also happen: require Facebook login with some excuse, then blackmail the creeps: "pay us this extortion or we're going to send proof of your creepiness to your contacts."

this post was submitted on 23 Apr 2024
917 points (100.0% liked)

Technology