[-] Thorny_Thicket@sopuli.xyz 3 points 1 year ago

https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

This is what I'm talking about.

And the issue with that parental control is, say you're a gay kid in Iran who sends nudes to your boyfriend, which Apple then reports to your ultra-conservative parents. That's not going to end well for you.

[-] 6xpipe_@lemmy.world 15 points 1 year ago

Apple Kills Its Plan to Scan Your Photos for CSAM

That headline literally says they're not doing that. It was a well-meaning initiative that they rightfully backed down on when called out.

I am one of the first to typically assume malice or profit when a company does something, but I really think Apple was trying to do something good for society in a way that is otherwise as privacy-focused as they could be. They just didn't stop to consider whether or not they should be proactive in legal matters, and when they got reamed by privacy advocates, they decided not to go forward with it.

[-] Thorny_Thicket@sopuli.xyz 3 points 1 year ago

Good on them for canceling those plans but they only did so because of the massive public outcry. They still intended to start scanning your photos and that is worrying.

However I'm not denying that it's probably still the most privacy focused phone you can get. For now.

[-] kirklennon@kbin.social 6 points 1 year ago* (last edited 1 year ago)

They still intended to start scanning your photos and that is worrying.

They wanted to scan photos stored in iCloud. Apple has an entirely legitimate interest in not storing CSAM on their servers. Instead of doing what every other photo service does, which is scan all of your photos on the server, they created a complex privacy-preserving method: an initial scan on device as part of the upload process, and, through the magic of math, images would only get matched as CSAM on the server if Apple was confident (a one-in-a-trillion false-positive rate) that you were uploading literally dozens of CSAM images, at which point a person would verify to make absolutely certain, and only then would your crime be reported.

The system would do the seemingly impossible: preserve the privacy of literally everybody except the people everyone agrees don't deserve it. If you didn't upload a bunch of CSAM, Apple itself would legitimately never scan your images. The scan happened on device and the match happened in the cloud, and only if there were enough matches to guarantee confidence. It's honestly brilliant, but people freaked out after a relentless FUD campaign, including from people and organizations who absolutely should know better.
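The counting logic described above can be sketched in a few lines. To be clear, this is a toy model only: the real system used NeuralHash perceptual hashes plus threshold secret sharing and private set intersection, so the server cryptographically learned nothing at all below the threshold. The function names, the SHA-256 stand-in for a perceptual hash, and the exact threshold value here are all illustrative assumptions, not Apple's implementation.

```python
import hashlib

# Roughly the "dozens of images" figure described above (Apple's
# published threshold was 30; treat this as an assumption here).
MATCH_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real perceptual hash matches
    # visually similar images; SHA-256 only matches identical bytes.
    return hashlib.sha256(image_bytes).hexdigest()

def flag_account(uploaded_images, known_bad_hashes) -> bool:
    """Return True only once the match count crosses the threshold,
    the point at which a human review step would follow. Below the
    threshold, nothing about the account is flagged."""
    matches = sum(1 for img in uploaded_images
                  if image_hash(img) in known_bad_hashes)
    return matches >= MATCH_THRESHOLD
```

The design point the comment is making is visible in the last line: a single match, or even a couple dozen, changes nothing; only a large number of confirmed matches triggers any action at all.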

[-] monad@programming.dev 6 points 1 year ago

Apple proposes change

Users vote against it

Apple doesn’t do change

Nothing to see here folks

[-] Thorny_Thicket@sopuli.xyz 1 points 1 year ago* (last edited 1 year ago)

I don't quite see it like that myself. If you want to portray yourself as a privacy-focused company, why would you even suggest such a feature? Even if their intentions are purely to protect children, with zero malicious future plans, they still know it's going to have bad optics and be widely controversial.

[-] monad@programming.dev 3 points 1 year ago

they still know it’s going to have bad optics and be widely controversial

How would they know that? It’s often hard to predict how users will react, sometimes your expectations are wrong.

[-] dynamojoe@lemmy.world 5 points 1 year ago

but they only did so because of the massive public outcry

Well, shit. For once the voice of the people worked and you're still bitching about it.

[-] Thorny_Thicket@sopuli.xyz 1 points 1 year ago

You're right. Maybe I'm being a bit too harsh and should give them some credit. After all, they reversed the decision to use those shitty butterfly switches on the MacBook keyboard too, and brought back HDMI and the SD card slot. Also ditched that stupid Touch Bar.

[-] murphys_lawyer@lemmy.world 10 points 1 year ago

i mean, that's a pretty niche case and maybe your underage kid shouldn't be sending nudes via imessage anyways.

[-] Thorny_Thicket@sopuli.xyz 1 points 1 year ago

That's a whole other discussion. It's just one example anyway. My point still stands: this does not increase user privacy.

[-] Nerdlinger@lemmy.world 10 points 1 year ago

The child in that case is not the user (or at least not the owner). The user is the parent who configures the phone as they choose and loans it to the child. It's no different than Apple allowing a business to configure a MacBook as they choose, including tools to monitor its usage, and then offering that computer to one of their employees. The owner of the device gets to choose the privacy settings, not necessarily the end user.

this post was submitted on 21 Jul 2023
641 points (100.0% liked)

Apple
