[-] FenrirIII@lemmy.world 33 points 1 day ago

Is that why all the executives and directors at my giant tech company are pushing AI? Fuckwits...

[-] friend_of_satan@lemmy.world 49 points 1 day ago* (last edited 1 day ago)

This does not sound good for those people. Writing is a way of thinking. AI writing assistants are competitive cognitive artifacts. People who use AI to write most of their written communication will get worse at thinking through writing.

[-] trolololol@lemmy.world 6 points 15 hours ago

Hey, this thing you're doing is called gatekeeping.

We get multiple versions of this every time a new tech comes along.

People defending typewriters. Or learning Latin. Or something better than a quill and jar of ink. Or paper being affordable.

Just. Stop.

[-] TORFdot0@lemmy.world 7 points 2 hours ago

There is published research that using AI makes people worse at critical thinking. It’s not gatekeeping, it’s a legitimate concern.

[-] pebbles@sh.itjust.works 2 points 1 hour ago

I mean, books did make us worse at memorizing. I think it's give and take. There are some things that are good to cognitively offload to an AI.

[-] TORFdot0@lemmy.world 1 points 55 minutes ago

I do agree that there are tasks that are good to offload to AI. I don’t believe that reading and writing should be. AI can be a great tool. Ironically, since you mentioned memorization, I can’t possibly retain 100% of the information I’ve learned in my career, so using LLMs to point me to the correct documentation or to create some boilerplate has greatly improved my productivity.

I’ve used AI as a conversational tool to assist in finding legitimate information to answer search queries (not just accepting its output at face value) and generating boilerplate code (not just using it as another Stack Overflow and copying and pasting the code it gives you without understanding it). The challenge is that if we try to replace 100% of the task of communication or research or coding, we eventually lose those skills. And I worry for juniors who are just building those skills but have totally relied on AI to do the work that’s supposed to teach them those skills.

[-] echodot@feddit.uk 19 points 14 hours ago

You seriously need to look up gatekeeping because that's not what it means at all.

Also, you are making stuff up. No one has ever been against learning Latin; it has always been seen as something a sophisticated gentleman knows, literally the opposite of whatever random nonsense you're claiming right now.

[-] Lemmist@lemm.ee 6 points 1 day ago

Most people don't need to think, they need to write. And AI helps them in that.

[-] TORFdot0@lemmy.world 1 points 2 hours ago

If they can’t think or write on their own then what is their value? Why not just go straight to the LLM and cut out the middle man?

[-] Geodad@lemm.ee 1 points 3 hours ago

Those people who don’t want to think need to be doing manual labor that doesn’t require thought.

[-] Lemmist@lemm.ee 3 points 3 hours ago

They prefer lawmaking.

[-] echodot@feddit.uk 16 points 14 hours ago

"Most people don't need to think"

No, they just don't do it. The world would be in a much better position if people engaged their brains occasionally.

[-] jrs100000@lemmy.world 114 points 1 day ago

People bad at math use calculators. People with bad handwriting prefer to type. Weak people use levers. Slow people rely more on wheels. It's like we're a bunch of tool-using primates or something.

[-] stickly@lemmy.world 20 points 14 hours ago* (last edited 14 hours ago)

In all of those examples, the user knows exactly what they want and the tool is a way to expedite or enable getting there. This isn't quite the same thing.

If we were talking about a tool like augmented audio-to-text, I'd agree. I'd probably even agree if it was an AI proofreader-style model where you feed it what you have to make sure it's generally comprehensible.

Writing as a skill is about solidifying and conveying thoughts so they can be understood. The fact that it turns into text is kind of irrelevant. Hand waving that process is just rubber stamping something you kinda-sorta started the process of maybe thinking about.

[-] jrs100000@lemmy.world 6 points 14 hours ago

I'm not really sure what you mean. They're not perfect, and in fact they'll usually reduce the quality of output for a skilled writer, but half of the adults in the US can't read and write at a sixth-grade level, and LLMs are greatly improving their ability to solidify and convey their thoughts in a more understandable way.

[-] stickly@lemmy.world 9 points 13 hours ago* (last edited 12 hours ago)

LLMs work by extrapolation; they can't output anything better than the context you give them. They're used in completely inappropriate situations because they're dead easy and give very digestible content.

Your brain is the only thing in the universe that knows the context of what you're writing and why. At a sixth grade level, you could technically describe almost anything but it would be clunky and hard to read. But you don't need an LLM to fix that.

We've had tools for years that help with the technical details of writing (basic grammar, punctuation, and spelling). There are also already tools to help with phrasing and specifying a concept ("hey Google, define [X]" or "what's the word for when...").

This is more time consuming than an LLM, but guarantees that what you write is exactly what you intend to communicate. As a bonus, your reading comprehension gets better. You might remember that definition of [X] when you read it.

If you have access to those tools but can't/won't use them then you'll never be able to effectively write. There's no magic substitute for literacy.

[-] jrs100000@lemmy.world 5 points 12 hours ago

An AI can produce content that is higher quality than the prompts it is given, particularly for formulaic tasks. I do agree that it would be nice if everyone were more educated, but a large portion of the population will never get there. If simply denying them AI were going to result in a blossoming of self-education, it would have already happened by now.

[-] stickly@lemmy.world 5 points 12 hours ago* (last edited 12 hours ago)

It can't ever accurately convey any more information than you give it, it just guesses details to fill in. If you're doing something formulaic, then it guesses fairly accurately. But if you tell it "write a book report on Romeo and Juliet", it can only fill in generic details on what people generally say about the play; it sounds genuine but can't extract your thoughts.

Not to get too deep into the politics of it but there's no reason most people couldn't get there if we invested in their core education. People just work with what they're given, it's not a personal failure if they weren't taught these skills or don't have access to ways to improve them.

And not everyone has to be hyper-literate, if daily life can be navigated at a 6th grade level that's perfectly fine. Getting there isn't an insurmountable task, especially if you flex those cognitive muscles more. The main issue is that current AI doesn't improve these skills, it atrophies them.

It doesn't push back or use logical reasoning or seek context. It's specifically made to be quick and easy, the same as fast food. We'll be facing the intellectual equivalent of the diabetes epidemic if it gets widespread use.

[-] jrs100000@lemmy.world 3 points 11 hours ago

It sounds like you are talking about use in education then, which is a different issue altogether.

You can and should set your AI to push back against poor reasoning and unsupported claims. They aren't very smart, but they will try.

[-] stickly@lemmy.world 2 points 11 hours ago* (last edited 10 hours ago)

I mean it's the same use; it's all literacy. It's about how much you depend on it and don't use your own brain. It might be for a mindless email today, but in 20 years the next generation won't be able to read the news without running it through an LLM. They'll have no choice but to accept whatever it says, because they never develop the skills to challenge it, kind of like simplifying things for a toddler.

The models can never be totally fixed; the underlying technology isn't built for that. It doesn't have "knowledge" or "reasoning" at all. It approximates them by weighing your input against a model of how words connect together and choosing a slightly random extension of them. Depending on the initial conditions, it might even give you a different answer for each run.
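To picture that "slightly random extension" part, here's a toy sketch (the candidate words and weights are made up for illustration, not any real model's internals):

```python
import math
import random

# A model scores each candidate next word by how well it fits the text so far,
# then samples from those scores with some randomness ("temperature").
candidates = {"the": 2.1, "a": 1.7, "juliet": 0.9, "tragedy": 0.4}  # made-up scores
temperature = 0.8

# Higher temperature flattens the weights, making unlikely words more probable.
weights = [math.exp(score / temperature) for score in candidates.values()]
next_word = random.choices(list(candidates), weights=weights, k=1)[0]
print(next_word)  # can differ on every run, which is exactly the point above
```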

[-] jrs100000@lemmy.world 3 points 10 hours ago

Is that any worse than people getting their world view from a talking head on 24-hour news, five-second video clips on their phone, or a self-curated selection of rage bait propaganda online? The mental decline of humanity is perpetual and overstated.

[-] stickly@lemmy.world 2 points 7 hours ago

One bad thing doesn't make a different but also bad thing OK. And in my opinion it is worse; imagine if their worldview could only come from five-second videos. Throw those history books away.

And I don't know that it's overstated and it's not at all perpetual. Look at... everything these days. People "disagree" with fundamental facts and are blindly allowing our planet to be burnt to the ground.

It takes concentrated effort to build and maintain an educated populace. The wide availability of books and increased literacy directly caused the Renaissance, pulling down the status quo and giving us access to modern medicine and literally every right + luxury you enjoy today.

[-] echodot@feddit.uk 4 points 14 hours ago* (last edited 13 hours ago)

Because they're not actually using the AI that way, to support them in their writing endeavors; they're just having the AI do the writing task for them.

A calculator doesn't do the understanding for you; it just does the calculation. You still need to understand what it is you're asking the calculator to do. If you want to calculate compound interest, you still need to understand the concepts behind compound interest in order to put the right calculations into the calculator.
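For example, the standard compound interest formula A = P(1 + r/n)^(nt) is trivial to punch in, but only if you already know what each piece means (a quick sketch with made-up numbers):

```python
# Compound interest: A = P * (1 + r/n) ** (n * t)
principal = 1000.0  # P: starting amount
rate = 0.05         # r: 5% annual interest
n = 12              # n: compounded monthly
years = 10          # t: duration in years

amount = principal * (1 + rate / n) ** (n * years)
print(f"{amount:.2f}")  # ~1647.01 -- the tool computes, but you chose the formula
```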

[-] jrs100000@lemmy.world 4 points 13 hours ago

I don't really think it's fair to expect the barely literate to have writing endeavours. They are just trying to communicate without embarrassing themselves.

[-] singletona@lemmy.world 71 points 1 day ago

'Researchers surprised that people who don't know how to do a thing cheat by using half-baked tools to do the thing for them.'

[-] muhyb@programming.dev 14 points 1 day ago

I'm surprised that researchers are surprised at all.

[-] Empricorn@feddit.nl 44 points 1 day ago* (last edited 1 day ago)

Even before the AI fad, services like Grammarly were surprising to me. So, you're marketing to non-readers, and people who want to sound better in written communication... without learning to write better... Huh. My current employment has very little formal writing as part of it, yet I still think learning how to effectively communicate is absolutely vital for any job, or at least for getting a better one...

[-] EngineerGaming@feddit.nl 7 points 1 day ago

I have also seen a video discussing that Grammarly often makes mistakes because it doesn't understand context and nuance as much as a human would.

[-] 3x7x37@lemmy.dbzer0.com 5 points 1 day ago

Grammarly is a keylogger; I'd look into alternatives.

[-] Petter1@lemm.ee 8 points 1 day ago

I always use AI to write texts.

I am too fucking lazy to write more than keywords 😆.

I let it format them into a proper text and tell it what it should adjust. That is one task AI is very good at (way better than I am).

For me, it is the faster approach, but I always tend to write with enormous information density (which is disliked by many people somehow) anyway.

I personally prefer the shortest wording with most information to read, so I sometimes let AI summarise.

[-] grrgyle@slrpnk.net 7 points 1 day ago

I'm not an AI fan, but thank you for using it to remove words, rather than turn 20 words into 200.

[-] Petter1@lemm.ee 6 points 1 day ago* (last edited 1 day ago)

Reading such articles made me vomit even prior to AI 🤣 newspaper writers just love expanding 3 sentences into, like, 5 paragraphs.

Edit: Paragraphs, not Absätze, lol

[-] Tywele@lemmy.dbzer0.com 3 points 1 day ago

5 Absätze

"Paragraphs" is the English word you were probably looking for 😅

[-] Petter1@lemm.ee 2 points 1 day ago
[-] Ulrich@feddit.org 2 points 1 day ago

AI doesn't really "summarize" though; it just chooses random topics to filter out.

[-] shield_87@lemmy.eco.br 2 points 1 day ago

i'm honestly curious about your writing style. maybe you could develop it or refine it! but yeah i don't judge you for using ai

[-] Petter1@lemm.ee 2 points 1 day ago

Of course I could, but I don’t want to 😆🤪

[-] shield_87@lemmy.eco.br 2 points 16 hours ago