1280
submitted 5 months ago by ElCanut@jlai.lu to c/programmerhumor@lemmy.ml
you are viewing a single comment's thread
view the rest of the comments
[-] eestileib@sh.itjust.works 29 points 5 months ago

LLM system input is unsanitizable, according to NVidia:

The control-data plane confusion inherent in current LLMs means that prompt injection attacks are common, cannot be effectively mitigated, and enable malicious users to take control of the LLM and force it to produce arbitrary malicious outputs with a very high likelihood of success.

https://developer.nvidia.com/blog/securing-llm-systems-against-prompt-injection/

[-] MalReynolds@slrpnk.net 2 points 5 months ago

Everything old is new again (GIGO)

this post was submitted on 07 Jun 2024
1280 points (100.0% liked)

Programmer Humor

32555 readers
467 users here now

Post funny things about programming here! (Or just rant about your favourite programming language.)

Rules:

founded 5 years ago
MODERATORS