Cool Guides
Rules for Posting Guides on Our Community
1. Defining a Guide: Guides are comprehensive reference materials, how-tos, or comparison tables. A guide must be well-organized both in content and layout. Information should be easily accessible without unnecessary navigation. Guides can include flowcharts, step-by-step instructions, or visual references that compare different elements side by side.
2. Infographic Guidelines: Infographics are permitted if they are educational and informative. They should aim to convey complex information visually and clearly. However, infographics that primarily serve as visual essays without structured guidance will be subject to removal.
3. Grey Area: Moderators may use discretion when deciding to remove posts. If in doubt, message us or use downvotes for content you find inappropriate.
4. Source Attribution: If you know the original source of a guide, share it in the comments to credit the creators.
5. Diverse Content: To keep our community engaging, avoid saturating the feed with similar topics. Excessive posts on a single topic may be moderated to maintain diversity.
6. Verify in Comments: Always check the comments for additional insights or corrections. Moderators rely on community expertise for accuracy.
Community Guidelines
- Direct Image Links Only: Only direct links to .png, .jpg, and .jpeg image formats are permitted.
- Educational Infographics Only: Infographics must aim to educate and inform with structured content. Purely narrative or non-informative infographics may be removed.
- Serious Guides Only: Nonserious or comedy-based guides will be removed.
- No Harmful Content: Guides promoting dangerous or harmful activities/materials will be removed. This includes content intended to cause harm to others.
By following these rules, we can maintain a diverse and informative community. If you have any questions or concerns, feel free to reach out to the moderators. Thank you for contributing responsibly!
I keep having to argue with people that the crap ChatGPT told them doesn't exist.
I asked an AI to explain how to change a completely fictional setting in an admin control panel, and it told me exactly where to go and which non-existent buttons to press.
I actually had someone send me a screenshot of instructions for doing exactly what they wanted, and I sent back screenshots of me following the directions to a tee, pointing out that the option didn't exist.
And it keeps happening.
"AI" gets big uppies energy from telling you that something can be done and how to do it. It does not get big uppies energy from telling you that something isn't possible. So it's basically going to lie to you about whatever you want to hear so it gets the good good.
No, seriously, there's a weighting system for responses. When something isn't possible, "it can't be done" tends to score as a less favorable response than a hallucinated way to make it work.
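As a toy illustration of that claim (the candidate answers and scores below are invented for this sketch, not taken from any real reward model): if agreeable, confident answers are rated higher than truthful refusals, a chooser that picks the top-scored response will hand you the confident how-to even when the honest answer is "that doesn't exist."

```python
# Invented candidate responses with made-up "reward" scores.
# The point: if the scoring favors agreeable answers, the truthful
# "not possible" response loses even though it is correct.
candidates = {
    "Go to Settings > Display > Defogger and toggle it off.": 0.9,
    "That setting does not exist in Windows.": 0.4,
}

# Pick the highest-scored candidate, ignoring truthfulness entirely.
best = max(candidates, key=candidates.get)
print(best)  # prints the confident (wrong) how-to
```

The scores are the whole trick here: nothing in the selection step checks the answer against reality, it only checks it against the reward.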
I am quickly growing to hate this so-called "AI". I've been on the Internet long enough that I can probably guess what the AI will reply to just about any query.
It's just... inaccurate, stupid, and not useful. Unless you're repeating something that's already been said a hundred different ways by a hundred different people and you just want to say the same thing... then it's great.
Hey, ChatGPT, write me a cover letter for this job posting. Cover letters suck and are generally a waste of fucking time, so who gives a shit?
To be fair, you could train an LLM on only Microsoft documentation with 100% accuracy, and it would still give you broken instructions, because Microsoft has 12 guides for how to do a thing and none of them work: they keep changing the layout, moving shit around, or renaming crap, and they don't update their documentation.
The worst is that they replace products and give them the same name.
Teams was replaced with "new" Teams, which then got renamed to Teams again.
Outlook is now known as Outlook (classic), and the new version of Outlook is just called Outlook.
Both are basically just webapps.
I could go on.
Yeah, the experience they described could have happened before ChatGPT, because MS was already providing "as cheap as possible" general support, and it was questionable whether that was any better than just publishing documentation and letting power users willing to help do so. Those support people clearly barely understood the question and gave many irrelevant answers, which search engines pick up and return when you search for the problem later.
Tbh, ChatGPT is a step up from that, even as bad as it is. The old support had that same annoying, overly corporate-friendly attitude but was even less accurate. Though I don't use Windows anymore on my personal desktop, so I don't have much recent experience.
This makes sense if you consider that it works by predicting the most likely next word in a sentence. Ask it where you can turn off the screen defogger in Windows and it will associate "screen" with "monitor" or "display"; "turn off" must mean a toggle... yeah, go to Settings -> Display -> Defogger toggle.
It's not AI, it's not smart; it's text prediction with a few extra tricks.
I describe it as unchecked autocorrect that just accepts the most likely next word without user input, trained on the entire Internet.
So the response reflects the average of every response on the public Internet.
Great for broad, common queries, but not great for specialized, specific and nuanced questions.
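The "unchecked autocorrect" idea above can be sketched as a toy bigram model (the tiny corpus below is made up for illustration): it counts which word follows which in its training text, then greedily emits the most frequent continuation, with no notion of whether the resulting instructions are true.

```python
from collections import Counter, defaultdict

# Made-up training text: two guides mention a "display toggle",
# one mentions "sound volume". The majority path wins regardless
# of whether the setting actually exists.
corpus = (
    "go to settings display toggle . "
    "go to settings display toggle . "
    "go to settings sound volume . "
).split()

# Count how often each word follows each word (a bigram table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict(word, steps=4):
    """Greedily emit the most frequent continuation of `word`."""
    out = [word]
    for _ in range(steps):
        ranked = following[out[-1]].most_common(1)
        if not ranked:
            break
        out.append(ranked[0][0])
    return " ".join(out)

print(predict("go"))  # -> "go to settings display toggle"
```

"Settings display toggle" comes out on top simply because it appeared twice and "sound volume" once; swap the counts and the model confidently recommends a different path. A real LLM replaces the frequency table with a learned probability distribution, but the "most likely next word" selection step is the same idea.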
It just copies corporate Kool-Aid yes-man culture. If it didn't, marketing would say it's not ready for release.
Think about it: how annoyed do corpo bosses and marketing get, labeling you as "difficult", when they come to you with a stupid idea and you call it BS? Now make the AI so that it pleases that kind of people.