That's the thing. CSAM filtering can be useful when images are posted to public websites. But scanning private files, no matter what the scan is for, is a major privacy concern.
CSAM is one thing, like pictures. But how can they detect “attempts to groom children”? If someone says on Mastodon that they’re sad at school, and I comment saying something nice to help them feel better, would that count as potential grooming? Will AI read and analyze every private message like that? And for better AI “predictions”, will every user be required to verify their age, sex, sexual orientation, hobbies, etc.?
Btw Happy Birthday GNU! 🎂
So they were able to shut down the whole world over the flu, but they can’t catch and arrest child abusers without scanning all private communications. Sounds legit.
this post was submitted on 26 Sep 2023