Clever, clever
(mander.xyz)
A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.
LLMs can cite. It's called Retrieval-Augmented Generation: basically an LLM that can do Information Retrieval, which is just the academic term for search engines.
You can just print the retrieval logs into references. Well, that's kinda stretching the definition of "just". A rough sketch of the idea is below.
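To make that concrete, here's a minimal toy sketch of the RAG loop, assuming an in-memory corpus and a stubbed-out model call (the function names and documents are made up for illustration): retrieve a few documents, hand them to the LLM as context, and emit the retrieval log as the reference list.

```python
# Toy RAG sketch: the "references" are just the documents the retriever logged.

def retrieve(query, corpus, k=2):
    """Toy keyword retrieval: rank documents by query-word overlap."""
    words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(words & set(doc["text"].lower().split())),
        reverse=True,
    )
    return scored[:k]

def answer_with_references(query, corpus):
    hits = retrieve(query, corpus)
    context = "\n".join(f"[{i+1}] {d['text']}" for i, d in enumerate(hits))
    # Stand-in for the real LLM call; an actual system would send
    # `context` + `query` to a model and ask it to cite [1], [2], ...
    answer = f"(model answer to {query!r}, grounded in: {context!r})"
    references = [f"[{i+1}] {d['title']}" for i, d in enumerate(hits)]
    return answer, references

corpus = [
    {"title": "Smith 2020, Enzyme kinetics review",
     "text": "enzyme kinetics follow michaelis menten behaviour"},
    {"title": "Lee 2021, Lab safety notes",
     "text": "always label your samples in the lab"},
]

answer, refs = answer_with_references("How do enzyme kinetics behave?", corpus)
print(answer)
print("\n".join(refs))
```

Because every reference comes straight out of the retrieval step, each cited item at least exists in the searched dataset; whether it actually supports the claim is a separate question.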
My question is whether the thing they are citing actually exists and, if it does exist, whether it contains the information it claims.
In the case of RAG, it exists in the searched dataset.
Not guaranteed.
Depends. In my experience, it usually does exist. There are still hallucinations where GPT makes up stuff or just misinterprets what it read. But it's super easy to read the GPT output, look at the cited works, skim them for relevance, then tweak the wording and citations to match.
If you just copy/paste and take GPT's word for it without a minimal amount of checking, you're digging your own grave.
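If you want to semi-automate that first pass, here's a small sketch, assuming you keep your sources in a dict keyed by citation (all names and data here are hypothetical). It only flags citations that are missing or share few keywords with the claim; it's a filter for what to read carefully, not a replacement for reading the cited work.

```python
# Flag citations that don't exist in your library or barely overlap the claim.

def flag_dubious_citations(claims, library, min_overlap=2):
    flagged = []
    for claim, cited_title in claims:
        source = library.get(cited_title)
        if source is None:
            flagged.append((claim, cited_title, "source not found"))
            continue
        overlap = set(claim.lower().split()) & set(source.lower().split())
        if len(overlap) < min_overlap:
            flagged.append((claim, cited_title, "little overlap, read carefully"))
    return flagged

library = {
    "Smith 2020": "michaelis menten kinetics describe enzyme saturation",
}
claims = [
    ("enzyme kinetics show michaelis menten saturation", "Smith 2020"),
    ("enzymes work faster at 90 degrees", "Jones 2019"),  # made-up citation
]

for claim, title, reason in flag_dubious_citations(claims, library):
    print(f"CHECK: '{claim}' cited as {title} ({reason})")
```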