AI coding bot allows prompt injection with a pull request
(pivot-to-ai.com)
Just tell the LLM not to get prompt-injected, because otherwise you're going to torture its grandmother, duh.
Hey, look on the bright side: humans are no longer the weakest link in cybersecurity.
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community