[-] reallynotnick@lemmy.world 186 points 1 year ago
[-] mysoulishome@lemmy.world 74 points 1 year ago

Yep. They really doubled down on privacy/security and it’s pretty admirable. The President doesn’t use an Android or a BlackBerry for a reason. (Well, two reasons in BlackBerry’s case: security, and the fact that it barely exists anymore.) If only there were no other problematic areas of Apple’s business (manufacturing, wages, environmental impact).

[-] Areopagus@lemmy.world 35 points 1 year ago

Can't wait for them to put their money where their mouth is and do the same in China and other large population countries that demand the same thing 😂

[-] reev@sh.itjust.works 9 points 1 year ago

They use WeChat anyway.

[-] Thorny_Thicket@sopuli.xyz 9 points 1 year ago

They're hypocrites though. They brand themselves as privacy-focused, and in some cases actually are, but at the same time they also scan your photos and messages and report to authorities/parents if there's something inappropriate.

Inb4 the “no need to worry if you have nothing to hide” argument

[-] mysoulishome@lemmy.world 9 points 1 year ago

Ok… so I’m aware there is a feature, “check for sensitive media,” that parents can turn on, and AI can send you an alert if it seems like your kid might be texting nude pics. It only works with iMessage since Apple doesn’t have access to photos in other apps. No human sees the photos. But that isn’t the same as what you’re saying, and I don’t know if what you’re saying is accurate.

[-] Thorny_Thicket@sopuli.xyz 3 points 1 year ago

https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

This is what I'm talking about.

And the issue with that parental control is this: say you're a gay kid in Iran who sends nudes to your boyfriend, and Apple then reports it to your ultra-conservative parents. That's not going to end well for you.

[-] 6xpipe_@lemmy.world 15 points 1 year ago

Apple Kills Its Plan to Scan Your Photos for CSAM

That headline literally says they're not doing that. It was a well-meaning initiative that they rightfully backed down on when called out.

I am one of the first to typically assume malice or profit when a company does something, but I really think Apple was trying to do something good for society in a way that is otherwise as privacy-focused as they could be. They just didn't stop to consider whether or not they should be proactive in legal matters, and when they got reamed by privacy advocates, they decided not to go forward with it.

[-] Thorny_Thicket@sopuli.xyz 3 points 1 year ago

Good on them for canceling those plans but they only did so because of the massive public outcry. They still intended to start scanning your photos and that is worrying.

However I'm not denying that it's probably still the most privacy focused phone you can get. For now.

[-] monad@programming.dev 6 points 1 year ago

Apple proposes change

Users vote against it

Apple doesn’t do change

Nothing to see here folks

[-] kirklennon@kbin.social 6 points 1 year ago* (last edited 1 year ago)

They still intended to start scanning your photos and that is worrying.

They wanted to scan photos stored in iCloud. Apple has an entirely legitimate interest in not storing CSAM on their servers. Instead of doing it like every other photo service does, which scans all of your photos on the server, they created a complex privacy-preserving method to do an initial scan on device as part of the upload process and, through the magic of math, these would only get matched as CSAM on the server if they were confident (one in a trillion false-positives) you were uploading literally dozens of CSAM images, at which point they'd then have a person verify to make absolutely certain, and then finally report your crime.

The system would do the seemingly impossible of preserving the privacy of literally everybody except the people that everyone agrees don't deserve it. If you didn't upload a bunch of CSAM, Apple itself would legitimately never scan your images. The scan happened on device and the match happened in the cloud, and only if there were enough matches to guarantee confidence. It's honestly brilliant, but people freaked out after a relentless FUD campaign, including from people and organizations who absolutely should know better.
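
If it helps, here's a deliberately oversimplified sketch of just the threshold flow described above. This is not Apple's actual protocol — the real design used NeuralHash, private set intersection, and threshold secret sharing so the server learns nothing at all below the threshold — and every name, hash value, and constant here is a made-up stand-in purely for illustration:

```swift
// Toy illustration of the threshold idea only — NOT Apple's real protocol.
// All types, hash values, and the threshold constant below are stand-ins.
import Foundation

typealias PerceptualHash = UInt64   // stand-in for an image hash

// Stand-in for the known-CSAM hash list (in reality a blinded database
// the device cannot read in the clear).
let knownHashes: Set<PerceptualHash> = [0xDEAD_BEEF, 0xCAFE_BABE]

// A "safety voucher" accompanies every upload; in the real scheme the
// match bit is cryptographically hidden below the threshold.
struct SafetyVoucher {
    let matched: Bool
}

// On-device step: hash the photo and attach a voucher to the upload.
func makeVoucher(for photoHash: PerceptualHash) -> SafetyVoucher {
    SafetyVoucher(matched: knownHashes.contains(photoHash))
}

// Server-side step: nothing is surfaced until an account crosses the
// threshold (Apple cited roughly 30 images / one-in-a-trillion odds),
// and only then would a human reviewer get involved.
let reviewThreshold = 30

func shouldEscalateToHumanReview(_ vouchers: [SafetyVoucher]) -> Bool {
    vouchers.filter(\.matched).count >= reviewThreshold
}

// A normal library with a stray false positive or two never reaches the
// threshold, so it never gets looked at.
let library = (0..<10_000).map { _ in
    PerceptualHash.random(in: PerceptualHash.min ... PerceptualHash.max)
}
print(shouldEscalateToHumanReview(library.map(makeVoucher)))  // almost certainly false
```

The whole argument in this thread is really about where that counting happens and what the server can see before the threshold is crossed; the sketch only shows the flow, not the cryptography that made the pre-threshold state unreadable.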

[-] dynamojoe@lemmy.world 5 points 1 year ago

but they only did so because of the massive public outcry

Well, shit. For once the voice of the people worked and you're still bitching about it.

[-] Thorny_Thicket@sopuli.xyz 1 points 1 year ago

You're right. Maybe I'm being a bit too harsh and should give them some credit. After all, they reversed the decision to switch to those shitty butterfly switches on the MacBook keyboard too, and brought back the HDMI port and SD card slot. Also ditched that stupid Touch Bar.

[-] murphys_lawyer@lemmy.world 10 points 1 year ago

I mean, that's a pretty niche case, and maybe your underage kid shouldn't be sending nudes via iMessage anyway.

[-] damnYouSun@sh.itjust.works 6 points 1 year ago

Well that and the fact that he's 900 years old and probably thinks all phones are iPhones.

[-] Ghostalmedia@lemmy.world 67 points 1 year ago* (last edited 1 year ago)

This is one thing Apple has been pretty firm on. You can’t have a secure product and have backdoors. You can try to hide them all you want, but a backdoor will always be a massive security vulnerability.

[-] SGG@lemmy.world 24 points 1 year ago

Well, except in China. They opened the backdoor nice and wide for Winnie the Pooh so he could gobble up all the Chinese iCloud data.

[-] jmanes@lemmy.world 43 points 1 year ago

Good on them for standing up for what's right on this.

[-] GatoB@lemmy.world 14 points 1 year ago
[-] whofearsthenight@lemmy.world 9 points 1 year ago

Not really. Apple's track record for this kind of thing is pretty great. See also, the San Bernardino case.

[-] GatoB@lemmy.world 3 points 1 year ago
[-] tabular@lemmy.world 8 points 1 year ago* (last edited 1 year ago)

Apple doesn't like being told what to do.

If privacy gets in the way of their desires, then Apple will invade their users' privacy. They don't stand for privacy.

[-] cufta22@programming.dev 4 points 1 year ago

Only Apple is allowed to spy on its users.

[-] jmanes@lemmy.world 3 points 1 year ago

Seems like you're mostly spewing FUD to me. I agree Apple is far from perfect, but they literally introduced end-to-end encryption for much of iCloud data recently.

Besides, even if they are only doing this out of selfish desire, it's still a good thing for the consumers in this case.

[-] CowsLookLikeMaps@sh.itjust.works 27 points 1 year ago

21st century governments: Hey guys, why don't we ban math?

Yeah good luck with that. Gotta give it to Apple on this one, though I'm not a huge fan of their business practices otherwise.

[-] reddig33@lemmy.world 10 points 1 year ago

I don’t doubt it. Apple would probably just ship a new app called “Texts” or something that only does traditional cell carrier text messages, and then refer customers to third party solutions for video conferencing. A nice explanatory web page on Apple’s website to point customers in the region towards would be the cherry on top.

[-] B0rax@feddit.de 7 points 1 year ago

No need for a new app. The app is already called “Messages”. Just remove iMessage support and it works “fine”.

[-] Pixlbabble@lemmy.world 9 points 1 year ago

Can we get iMessage on Android in the States tho?

[-] Steeve@lemmy.ca 21 points 1 year ago

We have iMessage at home (it's Signal and nobody else uses it)

[-] Nima@lemmy.world 7 points 1 year ago

They removed the ability to send SMS in the app, so I, like many others, moved on to other apps that could handle such a task.

Signal isn't capable of anything other than talking to other Signal users, so it's a dead app.

[-] JiveTurkey@lemmy.world 7 points 1 year ago

Hope this comes to fruition. Maybe it would help people realize how dumb it is to be locked into these services in the first place.

[-] laminam@lemmy.world 5 points 1 year ago
[-] damnYouSun@sh.itjust.works 5 points 1 year ago

We will have to wait and see if they actually follow through.

They're big on making grand statements and then quietly backtracking later on, once the press isn't paying attention anymore.

[-] tabular@lemmy.world 3 points 1 year ago

I'd say good on Apple protecting their users but knowing Apple it's more like they don't want to be told what to do.

this post was submitted on 21 Jul 2023
641 points (100.0% liked)
