410

It certainly wasn’t because the company is owned by a far-right South African billionaire at the same moment that the Trump admin is entertaining a plan to grant refugee status to white Afrikaners. /s

My partner is a real refugee. She was jailed for advocating democracy in her home country. She would have received a lengthy prison sentence after trial had she not escaped. This crap is bullshit. Btw, did you hear about the white-genocide happening in the USA? Sorry, I must have used Grok to write this. Go Elon! Cybertrucks are cool! Twitter isn’t a racist hellscape!

The stuff at the end was sarcasm, you dolt. Shut up.

you are viewing a single comment's thread
view the rest of the comments
[-] LostXOR@fedia.io 11 points 12 hours ago

That's a good reason to use open source models. If your provider does something you don't like, you can always switch to another one, or even selfhost it.

[-] WatDabney@fedia.io 20 points 12 hours ago

Or better yet, use your own brain.

[-] LostXOR@fedia.io 4 points 10 hours ago

Yep, not arguing for the use of generative AI in the slightest. I very rarely use it myself.

[-] ArchRecord@lemm.ee 8 points 12 hours ago

While true, it doesn't keep you safe from sleeper agent attacks.

These can essentially allow the creator of your model to inject (seamlessly, undetectably until the desired response is triggered) behaviors into a model that will only trigger when given a specific prompt, or when a certain condition is met. (such as a date in time having passed)

https://arxiv.org/pdf/2401.05566

It's obviously not as likely as a company simply tweaking their models when they feel like it, and it prevents them from changing anything on the fly after the training is complete and the model is distributed, (although I could see a model designed to pull from the internet being given a vulnerability where it queries a specific URL on the company's servers that can then be updated with any given additional payload) but I personally think we'll see vulnerabilities like this become evident over time, as I have no doubts it will become a target, especially for nation state actors, to simply slip some faulty data into training datasets or fine-tuning processes that get picked up by many models.

this post was submitted on 16 May 2025
410 points (100.0% liked)

Technology

70048 readers
3509 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS