457
you are viewing a single comment's thread
view the rest of the comments
view the rest of the comments
this post was submitted on 31 Jan 2026
457 points (100.0% liked)
Technology
81907 readers
2718 users here now
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related news or articles.
- Be excellent to each other!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
- Check for duplicates before posting, duplicates may be removed
- Accounts 7 days and younger will have their posts automatically removed.
Approved Bots
founded 2 years ago
MODERATORS
doesn't even have to be the site owner poisoning the tool instructions (though that's a fun-in-a-terrifying-way thought)
any money says they're vulnerable to prompt injection in the comments and posts of the site
There is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.
Lmao already people making their agents try this on the site. Of course what could have been a somewhat interesting experiment devolves into idiots getting their bots to shill ads/prompt injections for their shitty startups almost immediately.
I am a little curious about how effective a traditional chain mail would be on it.
Good god, I didn't even think about that, but yeah, that makes total sense. Good god, people are beyond stupid.
They also have a 'skill' sharing page (a skill is just a text document with instructions) and depending on config, the bot can search for and 'install' new skills on its own. and agyone can upload a skill. So supply chain attacks are an option, too.