submitted 1 month ago* (last edited 1 month ago) by RmDebArc_5@sh.itjust.works to c/memes@lemmy.world
[-] ch00f@lemmy.world 21 points 1 month ago

Last night, we tried to use chatGPT to identify a book that my wife remembers from her childhood.

It didn’t find the book, but instead gave us a title for a theoretical book that could be written that would match her description.

[-] dis_honestfamiliar@lemmy.sdf.org 7 points 1 month ago

At least it told you whether the book exists, instead of telling you when it was written (hallucinating).

[-] ch00f@lemmy.world 7 points 1 month ago

Maybe it’s trying to motivate me to become a writer.

[-] leverage@lemdro.id 3 points 1 month ago

The same thing happens every time I've tried to use it for search. It will be radioactive for this kind of task until someone figures that out. Quite frustrating: if they spent as much time determining when a user wants objective information with citations as they do determining whether a response breaks content guidelines, we might actually have something useful. Instead, we get AI slop.

this post was submitted on 17 Dec 2024
614 points (100.0% liked)
