[-] greasewizard@piefed.social 12 points 1 week ago

You can at least sue a doctor for malpractice if they make a mistake. If you follow medical advice from a chatbot and you die, who is liable?

Large Language Models were built to rewrite emails, not to give valid medical advice.
