[-] 200fifty@awful.systems 11 points 3 months ago* (last edited 3 months ago)

The bill mandates safety testing of advanced AI models and the imposition of “guardrails” to ensure they can’t slip out of the control of their developers or users and can’t be employed to create “biological, chemical, and nuclear weapons, as well as weapons with cyber-offensive capabilities.” It has been endorsed by some AI developers but condemned by others, who assert that its constraints will drive AI development out of California.

Man, if I can't even build homemade nuclear weapons, what CAN I do? That's it, I'm moving to Nevada!

this post was submitted on 07 Sep 2024
125 points (100.0% liked)

TechTakes
