
A new study from the Columbia Journalism Review showed that AI search engines and chatbots, such as OpenAI's ChatGPT Search, Perplexity, Deepseek Search, Microsoft Copilot, Grok, and Google's Gemini, are simply wrong far too often.

[-] HubertManne@moist.catsweat.com 2 points 1 day ago

Yeah, my thought was: how often is a web search "right"? To me, AI search is just another version of search. It's like at first searching gave you a list of URLs, then it gave you a list of URLs with human-readable names as well to make it clearer what each one was, then it started giving back little summaries to give an idea of what each page was saying. Now it gives you a summary of many pages. My main complaint is that these things should be required to give references, and their answers should pretty much look like a Wikipedia page, but with little drop-down carets or rollovers (although I prefer a drop-down myself).

[-] wjrii@lemmy.world 5 points 22 hours ago

It never was, but unlike the current batch of LLM assistants that are now dominating the tops of "search" results, it never claimed to be. It was more, "here's what triggered our algorithm as 'relevant.' Figure out your life, human."

Now, instead, you get a paragraph of natural text that will literally tell you all about cities that don't exist and confidently assert that bestiality is celebrated in Washington DC because someone wrote popular werewolf slash fanfic set in Washington state. Teach the LLMs some fucking equivocation and this problem is immediately reduced, but then it becomes obvious that these things aren't Majel Barrett in Star Trek and that they've been pushed out much too quickly.

[-] OhVenus_Baby@lemmy.ml 2 points 21 hours ago

From what I have seen, like duckAI in DuckDuckGo search, it cites references, and if you query GPT and other models with the specific data you're looking for, they will cite and give links to where they sourced the input from, like PubMed, etc.

For instance, I will query with something like: "Give me a list of flowers that are purple, cite all sources, and ensure accuracy of the data provided by cross-referencing with other studies, while using previous chats as context."

I find it's about how you phrase your queries and logic. Once you understand how the models work, rather than blindly accepting them as some supreme AI, you understand their limits and how to use the tools for what they are.
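For anyone who wants to bake that citation instruction in rather than retype it every time, here is a minimal sketch of the prompting pattern described above, assuming the OpenAI Python SDK; the model name is an assumption, and the same system-prompt approach should work with other providers' chat APIs.

```python
# Minimal sketch: enforce "cite all sources" via a system prompt.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name; any chat-capable model works
    messages=[
        {
            "role": "system",
            "content": (
                "Cite a source (title and URL) for every factual claim. "
                "If you cannot find a source, say so instead of guessing."
            ),
        },
        {
            "role": "user",
            "content": (
                "Give me a list of flowers that are purple. "
                "Cite all sources and cross-reference them for accuracy."
            ),
        },
    ],
)

print(response.choices[0].message.content)
```

Whether the model actually honors the instruction is another matter, which is why checking the returned links still falls on the reader.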

[-] HubertManne@moist.catsweat.com 5 points 21 hours ago

I really feel it should not be necessary to ask them to cite all sources, though. It should be default behavior.
