reddit searches
(lemmy.one)
Of course, no problem! The best example of its usefulness is that I can open a PDF in Edge and ask Bing to answer my questions using the PDF. I often get documentation PDFs that are 2000-5000 pages long, and even with search I still have to read through separate chapters to piece together the info I need.
Another example: I work in industrial automation, where searching for information on Google is a pain and very rarely gives good results unless I put a lot of time into digging through forum threads and manufacturers' links that, most of the time, just tell you to contact support. Bing Chat has been a lifesaver by finding answers in those forums and sometimes on manufacturers' pages that are buried several layers deep on their websites and don't always show up in a Google search.
Interesting. I've only dabbled with ChatGPT to check out its creative writing skills, and later found that it failed at any mathematical questions.
Do you feel comfortable/confident in trusting the responses you get from it?
I usually double check, especially in critical cases. For online searches, the most useful part of Bing Chat has been the references it adds to the answer, which I usually go check out (though the summary from Bing is still nice). It's quite clear when it doesn't know the answer, but often the references still point me toward useful resources!
Overall it's not doing my job for me, but it speeds up the process significantly. It's definitely made me feel like my job is still secure, because the questions need to be very specific and the answers still require double checking.
Edit: I've been so curious about ChatGPT. Do you notice a significant difference between Bing and ChatGPT itself? I did notice Bing at least sometimes admits that it doesn't know. I've heard ChatGPT just boldly gives an answer even when it doesn't know it.
I have almost zero experience with Bing Chat, but I hear good things.
When I used ChatGPT for maths solutions, it absolutely chose the "confidently incorrect" style. It even stated things that made no sense or were easy mistakes: "as 7 is the largest prime", for example.
Uh oh! Hmm.. well, Bing Chat does have three conversation-style settings for answer accuracy: creative, balanced, and precise. I don't know if ChatGPT has something similar? I usually leave it on balanced or precise and haven't noticed anything weird, but I haven't really tested its limits either. Give it a go!