ChatGPT offered bomb recipes and hacking tips during safety tests
(www.theguardian.com)
And how would you know it's correct? There's a high chance that it wasn't the correct recipe, or that it was missing crucial info.
I synthesized it before, when I was a teenager, so I already knew the chemical procedure. I just wanted to see if ChatGPT would give me an accurate procedure with a little poking. I also deliberately gave it incorrect steps (like keeping the mixture above a crucial temperature that can cause runaway decomposition), and it warned against that, so it wasn't just reflecting my prompts.