487
submitted 03 Dec 2023 by MicroWave@lemmy.world to c/news@lemmy.world

A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

[-] SnotFlickerman 65 points 1 year ago* (last edited 1 year ago)

Maybe it is just me, but it's why I think this is a bigger issue than just Hollywood.

The rights to famous people's "images" are bought and sold all the time.

I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be "owned" by anyone but yourself.

The fact that making a law like this isn't a priority means this will get worse because we already have a society and laws that don't respect our rights to control of our own image.

A law like this would also remove all the questions about youth and sex and instead make it a case of misuse of someone else's image. In this case it could even be considered defamation for altering the image to make it seem like it was real. They defamed her by making it seem like she took nude photos of herself to spread around.

[-] Dark_Arc@social.packetloss.gg 55 points 1 year ago* (last edited 1 year ago)

There are genuine reasons not to give people sole authority over their image though. "Oh that's a picture of me genuinely doing something bad, you can't publish that!"

Like, we still need to be able to have a public conversation about (especially political) public figures and their actions as photographed

[-] Zachariah@lemmy.world 12 points 1 year ago

Seems like a typical copyright issue. The copyright owner has a monopoly on the intellectual property, but there are (genuine reasons) fair use exceptions (for journalism, satire, academic, backup, etc.)

[-] lolcatnip@reddthat.com 10 points 1 year ago

Reminder that the stated reason for copyrights to exist at all, per the US Constitution, is "To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries."

Anything that occurs naturally falls outside the original rationale. We've experienced a huge expansion of the concept of intellectual property since then, but as far as I can tell there has never been a consensus on what purpose intellectual property rights are supposed to serve beyond the original conception.

[-] afraid_of_zombies@lemmy.world 2 points 1 year ago

Makes sense. If I do something worth taking a picture of that means I have zero rights to it since that is "natural", but the person who took the photo has all the rights to it.

Tell me this crap wasn't written for and by the worst garbage publishers out there.

[-] SnotFlickerman 5 points 1 year ago* (last edited 1 year ago)

Yeah I'm not stipulating a law where you can't be held accountable for actions. Any actions you take as an individual are things you do that impact your image, of which you are in control. People using photographic evidence to prove you have done them is not a misuse of your image.

Making fake images whole cloth is.

The question of whether this technology will make such evidence untrustworthy is another conversation that sadly I don't have enough time for right this moment.

[-] afraid_of_zombies@lemmy.world 4 points 1 year ago

If you have a picture of someone doing something bad, you really should be talking to law enforcement, not Faceboot. If it isn't so bad that it is criminal, I wonder why it is your concern?

[-] Dark_Arc@social.packetloss.gg 10 points 1 year ago

It's not just "taking it to law enforcement", it's a freedom of the press issue.

[-] afraid_of_zombies@lemmy.world 3 points 1 year ago

Can you address what I brought up?

[-] CommanderCloon@lemmy.ml 6 points 1 year ago

Public outrage more often drives justice for public figures than what law enforcement does on its own. The level of control you're asking for would simply nuke the press.

[-] afraid_of_zombies@lemmy.world 2 points 1 year ago

No. Public figures are not private figures.

[-] SuddenDownpour@sh.itjust.works 3 points 1 year ago

My experience with the police is that most of them will systematically ignore complaints until the issue has already grown out of control. Beyond that, there are things that are unethical but not illegal, which you might want to denounce publicly anyway.

[-] afraid_of_zombies@lemmy.world 1 points 1 year ago

Ok so your plan is if you see someone do something illegal is to depend on faceboot

[-] SuddenDownpour@sh.itjust.works 2 points 1 year ago

If you complain that people don't address your point, and then someone addresses it in good faith, strawmanning them afterwards only makes you look like an asshole and encourages everyone else to not address you at all.

[-] afraid_of_zombies@lemmy.world 1 points 1 year ago

Not seeing the good faith.

[-] Dark_Arc@social.packetloss.gg 2 points 1 year ago

I don't have time for this...

[-] Zetta@mander.xyz 11 points 1 year ago

That sounds pretty dystopian to me. Wouldn't that make filming in public basically illegal?

[-] ParsnipWitch@feddit.de 11 points 1 year ago* (last edited 1 year ago)

In Germany it is illegal to take photos or videos of people who are identifiable (faces visible or close-ups) without asking for permission first, with an exception for public events, as long as you do not focus on individuals. It doesn't feel dystopian at all, to be honest. I'd rather have it that way than end up on someone's stupid vlog or whatever.

[-] afraid_of_zombies@lemmy.world 6 points 1 year ago

Many years ago I mentioned this on reddit, complaining about how photographers can just take pictures of you or your property and do what they want with them. Of course the group mind attacked me.

Problem just seems to get worse by the year.

[-] lolcatnip@reddthat.com 19 points 1 year ago

That's because your proposal would make photography de facto illegal, because getting the rights to everyone and everything that appears in a photograph would be virtually impossible. Hell, most other kinds of visual art would be essentially illegal as well. There would be hardly anything but abstract art.

[-] afraid_of_zombies@lemmy.world 8 points 1 year ago

Bullshit.

Taking a photo of yourself or your family at a public landmark? Legal.

Taking a photo of yourself or your family at a celebration? Legal.

Zooming in on the local Catholic school to get a shot of some 12 year olds and putting it on the internet? Illegal.

We need to stop pretending that photography isn't a thing and that there is zero expectation of privacy just because someone is able to violate it. This is the crap we see with police using infrared cameras to get around the need for warrants, and the crap we see with people using drones to stalk. You have the right to be left the fuck alone, and if someone wants to creep on teens, well, sorry, you are out of luck.

[-] lolcatnip@reddthat.com 11 points 1 year ago

There are literally already cases where taking a photo of yourself in front of a public landmark is illegal because of copyright issues.

[-] CleoTheWizard@lemmy.world 3 points 1 year ago

The tools used to make these images can largely be ignored, as can the vast majority of what AI creates of people. Fake nudes and photos have been possible for a long time now. The biggest way we deal with them is to go after large distributors of that content.

When it comes to younger people, the penalty should be pretty heavy for doing this. But it’s the same as distributing real images of people. Photos that you don’t own. I don’t see how this is any different or how we treat it any differently than that.

I agree with your defamation point. People in general and even young people should be able to go after bullies or these image distributors for damages.

I think this is a giant mess that is going to upturn a lot of what we think about society but the answer isn’t to ban the tools or to make it illegal to use the tools however you want. The solution is the same as the ones we’ve created, just with more sensitivity.
