Adobe (lemy.lol)
[-] Thrashy@lemmy.world 67 points 8 months ago

Oof... At work we deal with clients whose projects are covered by NDAs and confidentiality agreements, among other things. This is bad enough if the information scanned is siloed per organization, as it could create a situation where somebody not under NDA could access confidential client info leaked by an LLM that ingested every PDF in Adobe's cloud service without regard to distribution. Even worse if they're feeding everything back into a single global LLM -- corporate espionage becomes as simple as a bit of prompt engineering!

[-] BluesF@lemmy.world 21 points 8 months ago

I highly doubt that they would be able to use private user data for training. Using data available on the internet is a bit legally grey, but using data that is not publicly available would surely be illegal. And when the LLM "reads" a document at inference time, that isn't training, so it won't store the data and be able to regurgitate it later.*

* that is, if they have designed this in an ethical and legal way 🙃

[-] Monument@lemmy.sdf.org 35 points 8 months ago

They will use every scrap of data you haven’t explicitly told them not to use, and they will make it so that the method to disable these ‘features’ is little known, difficult to understand/access, and automatically re-enabled every release cycle. When they are sued, they will point to announcements like this and the one or two paragraphs in their huge EULA to discourage, dismiss, and slow down lawsuits.

[-] lud@lemm.ee 7 points 8 months ago

I suspect that they will explicitly advertise that they won't be using any data for training. Just like Microsoft Copilot enterprise (or whatever it's called) and Bing chat enterprise.

Companies absolutely know the risk with these systems and will never allow or buy a system that scans and saves their data.

[-] Monument@lemmy.sdf.org 6 points 8 months ago

I had a second part of my comment that I left off because I felt like I was hitting the point too hard, but…

I have firsthand knowledge of an organization that’s a GCC tenant. That’s the government cloud, and in mid-2022 Microsoft rolled out a product called Microsoft Viva without first consulting with platform admins. They just pushed it out into M365, activated and enabled. A personalized automated email was sent out to every person within the org, from Microsoft.com, with snippets of emails deemed to be follow up items by “Cortana” - which platform admins had disabled on every computer within the org. It was pretty clear that Microsoft had exfiltrated government data, analyzed it, and then sent emails to users regarding their analysis.
Platform admins did find a way to disable it within a few days, and leadership sent out an email characterizing the episode as a misconfigured, early release feature to assuage concerns. They promised to get to the bottom of it with Microsoft, and nothing was ever heard about it again.

Then earlier this year - multiple pushes of consumer apps and features which are not released on the GCC roadmap. Automatic install of New Teams, which - thankfully, displays a message that the user isn’t licensed for it, but that creates IT tickets because it auto-launches and disables classic teams from auto-launching. Lots of user confusion there. New Outlook, which didn’t support data classification, multiple mailboxes, and many of the features that make Outlook useful. It’s been a huge boondoggle as users have enabled new Outlook, and then don’t know how to switch back to a working version of Outlook. Recently everyone’s PowerBI began failing to launch, because Microsoft rolled out a OneDrive/SharePoint integration without testing it. Same with HP Print manager.

My point in all that isn't just to recite a laundry list of Microsoft failures (I have one for Adobe, too); it's to establish that updates are not vetted and are often pushed into the wrong update channels.
When pressed, it's always a 'configuration error' or an accidental early release. A bug or what-have-you.

The line from annoying to dangerous is going to be quickly crossed once these companies start training AI on the harvested PII and government data they’ve procured through the sloppy deployment practices they’re already engaged in.
I guarantee you that rogue hackers and nation states alike are working on fuzzing every AI dataset they can, to see if it picked up anything juicy. Once Adobe gets their hands on everyone’s scanned health record, classified documents, and credit card application, we’re going to see an endless stream of ‘whoopsies.’

[-] Lmaydev@programming.dev 3 points 8 months ago

All the ones I've seen that are aimed at companies have explicit terms that protect your data and don't allow it to be shared anywhere.

[-] Monument@lemmy.sdf.org 3 points 8 months ago* (last edited 8 months ago)

But that’s just like, a suggestion, man.

And it’s kind of predicated on their admins being highly proactive about data protection, because the vendors certainly aren’t.

[-] restingboredface@sh.itjust.works 10 points 8 months ago
> that is, if they have designed this in an ethical and legal way 🙃

This is Adobe we're talking about...

[-] LWD@lemm.ee 4 points 8 months ago

A corporation that charges a monthly subscription for products it could sell outright, offers them to students at the time when they're most likely to develop habits around them, and uses a proprietary storage format that only works well with its own products.

Once you get a customer addicted, you've got them for life.

[-] Lodespawn@aussie.zone 41 points 8 months ago

Does the AI include a feature that converts the bloated, non-functional hulk of an application that is Adobe Acrobat into a usable, fit-for-purpose PDF viewer/writer/editor with a consistent interface? Oo I really hope it does, that would be really helpful.

[-] TheFriar@lemm.ee 16 points 8 months ago

Check out SumatraPDF. When I started a job with a ton of random PDF paperwork to fill out, I needed to find something to use. It’s awesome. And free.

[-] sugar_in_your_tea@sh.itjust.works 5 points 8 months ago

If you just need to fill out forms, you can just use Firefox (and probably Chrome).

[-] Lodespawn@aussie.zone 2 points 8 months ago

I need to do drawing markups. Bluebeam does a good job, but my current company refuses to get it and insists that Acrobat Pro is functional. I feel like that's something decided by someone who never has to use Acrobat Pro.

[-] sugar_in_your_tea@sh.itjust.works 3 points 8 months ago

Wow, that's stupid. I just looked it up and it costs a few hundred per year, which is probably way less than the cost of the time you waste using a bad tool. If I were your manager, I'd get it for you.

[-] pkill@programming.dev 1 points 8 months ago

zathura or evince ftw

[-] aeronmelon@lemmy.world 31 points 8 months ago

"To learn more about the capabilities, and whether or not we'll allow you to disable them,..."

[-] uvok@pawb.social 18 points 8 months ago

Paperless-ngx is an awesome way to self host your documents.
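For anyone curious, here's a minimal sketch of the standard Docker Compose setup (volume names, port mapping, and the `./consume` path are just illustrative choices; see the official install docs for the full stack with a database and OCR options):

```yaml
# Minimal Paperless-ngx stack: Redis as the task broker plus the web server.
services:
  broker:
    image: docker.io/library/redis:7
    restart: unless-stopped

  webserver:
    image: ghcr.io/paperless-ngx/paperless-ngx:latest
    restart: unless-stopped
    depends_on:
      - broker
    ports:
      - "8000:8000"            # web UI at http://localhost:8000
    environment:
      PAPERLESS_REDIS: redis://broker:6379
    volumes:
      - data:/usr/src/paperless/data          # database/index
      - media:/usr/src/paperless/media        # original + archived documents
      - ./consume:/usr/src/paperless/consume  # drop PDFs here to ingest them

volumes:
  data:
  media:
```

Anything dropped into the consume folder gets OCR'd, indexed, and made full-text searchable, all on your own hardware instead of Adobe's cloud.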

[-] mindbleach@sh.itjust.works 18 points 8 months ago* (last edited 8 months ago)

EULAs are intolerable. The entire concept is invalid because of blatant abuse like this.

[-] DigitalTraveler42@lemmy.world 16 points 8 months ago

Brian Krebs, one of the most well known security bloggers:

https://krebsonsecurity.com/

[-] SoupBrick@yiffit.net 12 points 8 months ago

Ya know, AI has really pushed the Cyber Crime field years into the future! Adobe made an excellent decision adding it to their suite of technology used by businesses around the world!

[-] andrew_bidlaw@sh.itjust.works 7 points 8 months ago

It's almost like they don't have enough money already.

this post was submitted on 20 Feb 2024
303 points (100.0% liked)

People Twitter
