I would have thought that deepfakes are defamation per se. The push to criminalize this is quite the break with American First Amendment traditions.
If I understand correctly, this would put any image host, including Lemmy, in hot water, because Section 230 immunity only covers civil suits, not federal criminal prosecution.
The text of the bill exempts service providers from liability as long as they make a good-faith attempt to remove the material as soon as they are aware of its existence. So if someone posts AI-generated revenge porn on your instance, as long as you take it down when notified, you won't be in trouble.
Which part says that?
So the law requires intent and carves out exceptions for service providers that try to remove it.
You can read the whole text here.
The lower part just says that overeager removal of depictions does not create liability. Say OnlyFans bans a creator's account because some face-recognition AI thought their porn depicted a celebrity. The creator has no recourse for lost income.
As to the upper part, I am not sure what "reckless disregard" means in this context. I don't think it means you only have to act if you happen to receive a complaint. If you see nudes of some non-porn celebrity, they're most likely fake, and it seems reckless not to remove them immediately. What if there aren't enough mods to look at each image? Is it reckless to keep operating?
I appreciate your reading of the text. I am not a lawyer, so it isn't always clear to me how to read the legal language crafted into these bills. Since the quoted part is under the criminal-penalty section of the bill, I read it as releasing the service provider from criminal liability if it tries to stop the distribution. I see your point in how you read it, and that makes sense to me.
Yes, expressions like "reckless disregard" can have meanings that are unclear to non-experts. It means specific things in the context of specific laws, and I can't guess how it should be interpreted here.
I took some words out to improve readability.
I believe the second one is for, e.g., someone making a database of banned material so that it can be filtered automatically on upload, or someone using those images to train an AI to recognize fakes. For that purpose it would be necessary to "disclose" (i.e., distribute) the images to the people working on it, perhaps an outside company.
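To make the upload-filtering idea concrete, here's a minimal sketch of how a hash database like that could be checked at upload time. It's purely illustrative, assuming a perceptual hash (the imagehash library's average_hash) and a pre-built set of hashes of known banned images; the names, threshold, and placeholder hash are all hypothetical, not from the bill or any real instance's code.

```python
# Minimal sketch of hash-based upload filtering against a database of
# banned material. All names and values here are illustrative.
from PIL import Image
import imagehash

# Hashes of known banned images, built ahead of time (placeholder entry;
# a real hash would come from hashing an actual banned image).
BANNED_HASHES = {
    imagehash.hex_to_hash("ffd8c0a0b0e0f000"),
}

# Hamming-distance threshold: how different two hashes can be and still
# count as a match. Chosen arbitrarily for this sketch.
MAX_DISTANCE = 5

def is_banned(path: str) -> bool:
    """Return True if the image at `path` matches a known banned hash."""
    h = imagehash.average_hash(Image.open(path))
    # Subtracting two ImageHash objects gives their Hamming distance.
    return any(h - banned <= MAX_DISTANCE for banned in BANNED_HASHES)

def handle_upload(path: str) -> None:
    if is_banned(path):
        print("upload rejected: matches banned-material database")
    else:
        print("upload accepted")
```

The point of using a perceptual hash rather than an exact checksum is that re-encoded or slightly cropped copies still land within the distance threshold, which is what makes a shared database of this kind useful across sites.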
It's not; people know it's a deepfake most of the time and don't claim it's real.
It might also be harassment.
If it's not defamation or harassment, then I'm not sure what the problem is. As broad as this is, it looks unconstitutional to me.