1319
submitted 6 months ago by boem@lemmy.world to c/technology@lemmy.world
[-] Diplomjodler3@lemmy.world 343 points 6 months ago

It's not AI that's the problem; it's the half-baked, insecure, data-harvesting products pushed by big corporations.

[-] DarkThoughts@fedia.io 148 points 6 months ago

The biggest joke is that the LLM in Windows is running locally, it uses your hardware and not some big external server farm. But you can bet your ass that they still use it to data harvest the shit out of you.

[-] Saik0Shinigami@lemmy.saik0.com 135 points 6 months ago

To me this is even worse though. They're using your electricity and CPU cycles to grab the data they want, which lowers their bandwidth bills.

It happening "locally" while still sending all the metadata home is just a slap in the face.

[-] NutWrench@lemmy.world 58 points 6 months ago

Also, CoPilot is going to be bundled with Office 365, a subscription service. You're literally paying them to spy on you.

[-] Andromxda@lemmy.dbzer0.com 28 points 6 months ago* (last edited 6 months ago)
  • be microsoft, a whole bunch of greedy user-hostile fucks
  • make spyware
  • tell users that spyware is really cool and useful
  • make them pay for the spyware
  • use the spyware to get their data
  • sell their data
  • profit
[-] Plopp@lemmy.world 16 points 6 months ago
[-] Andromxda@lemmy.dbzer0.com 14 points 6 months ago
  • create a monopoly on operating systems
  • leverage your dominant market position and force everyone to use your spyware
  • big profit
[-] DarkThoughts@fedia.io 16 points 6 months ago

Exactly. And if I use or even pay for an external LLM service then that's also my decision. But they force this scheme onto every user, whether they want it or not. It's like the worst out of all possible scenarios.

[-] Bishma@discuss.tchncs.de 38 points 6 months ago

That's a pretty big joke, but I think the bigger joke is calling LLMs AI. We taught linear algebra to talk real pretty and now corps want to use it to completely subsume our lives.

[-] grue@lemmy.world 14 points 6 months ago

I think the bigger joke is calling LLMs AI

I have to disagree.

Frankly, LLMs (which are based on neural networks) seem a Hell of a lot closer to how actual brains work than "classical AI" (which basically boils down to a gigantic pile of if statements) does.

I guess I could agree that LLMs are undeserving of the term "AI", but only in the sense that nothing we've made so far is deserving of it.

[-] Brickardo@feddit.nl 1 points 6 months ago

Let's agree to disagree then. An LLM has no notion of semantics; it's just outputting the most likely next word given what it's already written and the user's input.
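The "most likely next word" step described here can be sketched in a couple of lines. The vocabulary and probabilities below are made-up toy values, not real model output; a real LLM computes these scores from billions of parameters.

```python
# Toy sketch of greedy next-token selection. There is no notion of
# meaning here: the function just picks the highest-scoring word
# given the context so far.
def next_token(probs: dict) -> str:
    return max(probs, key=probs.get)

context = "The cat sat on the"
probs = {"mat": 0.55, "roof": 0.25, "moon": 0.05, "idea": 0.01}
print(context, next_token(probs))  # appends "mat"
```

Sampling strategies in real systems are fancier (temperature, top-k, and so on), but the core step is this same "pick a likely continuation" move.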

On the contrary, expert systems from back in the 90s for, say, predicting the atomic structure of an element, work like a human brain on steroids. They feature an arbitrarily large search tree that the software knows how to iteratively prune according to a well-known set of chemical rules. We do the same when analyzing a set of options.
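That kind of rule-guided pruning can be sketched like this. The "rules" and search space below are made-up stand-ins for the chemical constraints mentioned above; this is an illustration of the pattern, not a real expert system.

```python
# Toy sketch of expert-system-style search: expand candidate states
# level by level and prune any branch that violates a hard rule.
def search(start, rules, expand, depth):
    frontier = [start]
    for _ in range(depth):
        frontier = [
            child
            for state in frontier
            for child in expand(state)
            if all(rule(child) for rule in rules)  # prune rule-breakers
        ]
    return frontier

# Example: build 3-digit strings from "123", pruning digit sums above 5.
expand = lambda s: [s + d for d in "123"]
rules = [lambda s: sum(map(int, s)) <= 5]
results = search("", rules, expand, depth=3)
print(results)
```

The key property, unlike with an LLM, is that every surviving candidate can be traced back through the exact rules that let it through.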

Debugging "current" AI models, on the other hand, is impossible, because all we're doing is prescribing a composition of functions and forcing it to minimize a loss function. How can you currently tell that a certain model is going to work? Unless the mathematical theory ever catches up with the technology, we'll never know until we execute the code.
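"Minimizing a loss function" in this sense can be sketched with a toy one-parameter example (made-up data, not an actual model):

```python
# Toy sketch of training-as-loss-minimization: fit a single weight w
# so that w * x approximates y, by nudging w down the gradient of a
# squared-error loss. Real models do the same with billions of
# parameters, which is why their behavior is so hard to inspect.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # y = 2x, so ideal w is 2

w = 0.0
lr = 0.05
for _ in range(200):
    # Gradient of mean squared error (w*x - y)^2 with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # converges toward 2.0
```

Even here, all we can say in advance is "the loss went down"; with one weight that's easy to interpret, with billions it isn't.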

[-] DarkThoughts@fedia.io 9 points 6 months ago

Oh I agree. I typically put "AI" in quotation marks when using that term for LLMs, because to me they simply are not intelligent in any way. In my mind an AI would need an actual level of consciousness of sorts: the ability to form actual thoughts and to learn freely based on whatever senses it has. But "AI" is a term that's good for marketing as well as fearmongering, which we see a lot of in current news cycles and on social media. The problem is that most people do not even understand the basic principles of how LLMs work, which leads to a lot of misconceptions about their uses and misuses and what we should do about them. Weirdly enough, this makes LLMs both completely overhyped as a product and completely stigmatized as some nefarious tool. But I guess that fits today's society, which kinda seems to have lost all nuance and reason.

[-] snooggums@midwest.social 83 points 6 months ago

That is an accurate description of AI in common usage even if it isn't an inherent aspect of AI.

[-] andrew@lemmy.stuart.fun 12 points 6 months ago

Right, but AI is not the only way they're doing the data collection.

[-] pennomi@lemmy.world 21 points 6 months ago

Locally run AI could be great. But sending all your data to an external server for processing is really, really bad.

[-] dustyData@lemmy.world 10 points 6 months ago

You wrote AI twice.

this post was submitted on 03 Jun 2024
1319 points (100.0% liked)
