[-] UnfairUtan@lemmy.world 214 points 1 month ago* (last edited 1 month ago)

https://nmn.gl/blog/ai-illiterate-programmers

Relevant quote

Every time we let AI solve a problem we could’ve solved ourselves, we’re trading long-term understanding for short-term productivity. We’re optimizing for today’s commit at the cost of tomorrow’s ability.

[-] boletus@sh.itjust.works 40 points 1 month ago

Hey that sounds exactly like what the last company I worked at did for every single project 🙃

[-] Daedskin@lemm.ee 33 points 1 month ago* (last edited 1 month ago)

I like the sentiment of the article; however this quote really rubs me the wrong way:

I’m not suggesting we abandon AI tools—that ship has sailed.

Why would that ship have sailed? No one is forcing you to use an LLM. If, as the article supposes, using an LLM is detrimental, and it's possible to start having days where you don't use an LLM, then what's stopping you from increasing the frequency of those days until you're not using an LLM at all?

I personally don't interact with any LLMs, neither at work nor at home, and I don't have any issue getting work done. Yeah there was a decently long ramp-up period — maybe about 6 months — when I started on my current project at work where it was more learning than doing; but now I feel like I know the codebase well enough to approach any problem I come up against. I've even debugged USB driver stuff, and, while it took a lot of research and reading USB specs, I was able to figure it out without any input from an LLM.

Maybe it's just because I've never bought into the hype; I just don't see how people have such a high respect for LLMs. I'm of the opinion that using an LLM has potential only as a truly last resort — and even then will likely not be useful.

[-] Mnemnosyne@sh.itjust.works 14 points 1 month ago

"Every time we use a lever to lift a stone, we're trading long term strength for short term productivity. We're optimizing for today's pyramid at the cost of tomorrow's ability."

[-] julietOscarEcho@sh.itjust.works 12 points 1 month ago

Precisely. If you train by lifting stones you can still use the lever later, but you'll be able to lift even heavier things by using both your new strength AND the lever's mechanical advantage.

By analogy, if you're using LLMs to do the easy bits in order to spend more time on the harder problems, fuckin' a. But the idea that you can just replace actual coding work with copy-paste is a shitty one. Again by analogy with rock lifting: now you have noodle arms and can't lift shit if your lever breaks or doesn't fit under a particular rock or whatever.

[-] Ebber@lemmings.world 12 points 1 month ago

If you don't understand how a lever works, then it's a problem. Should we let any person with an AI design and operate a nuclear power plant?

[-] Guttural@jlai.lu 12 points 1 month ago

This guy's solution to becoming crappier over time is "I'll drink every day, but abstain one day a week".

I'm not convinced that "that ship has sailed" as he puts it.

[-] Hoimo@ani.social 10 points 1 month ago

Not even. Every time someone lets AI run wild on a problem, they're trading all trust I ever had in them for complete garbage that they're not even personally invested enough in to defend it when I criticize their absolute shit code. Don't submit it for review if you haven't reviewed it yourself, Darren.

[-] kabi@lemm.ee 104 points 1 month ago

If it's the first course where they use Java, then one could easily learn it in 21 hours, with time for a full night's sleep. Unless there's no code completion and you have to write imports by hand. Then, you're fucked.

[-] rockerface@lemm.ee 124 points 1 month ago

If there's no code completion, I can tell you even people who've been coding as a job for years aren't going to write it correctly from memory. Because we're not being paid to memorize this shit, we're being paid to solve problems optimally.
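For anyone who hasn't written Java by hand: this is the kind of import boilerplate people lean on the IDE for (an invented example, not from the thread):

```java
// Illustrative only: nobody memorizes these, the IDE fills them in.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class Gradebook {
    public static void main(String[] args) {
        // A map of student names to their scores.
        Map<String, List<Integer>> scores = new HashMap<>();
        scores.put("anon", new ArrayList<>(List.of(90, 54)));
        System.out.println(scores.get("anon")); // prints [90, 54]
    }
}
```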

[-] spamfajitas@lemmy.world 13 points 1 month ago

My undergrad program had us write Java code by hand for some beginning assignments and exams. The TAs would then type whatever we wrote into Eclipse and see if it ran. They usually graded pretty leniently, though.

[-] 404@lemmy.zip 32 points 1 month ago

My first programming course (in Java) had a pen and paper exam. Minus points if you missed a bracket. :/

[-] SatanClaus@lemmy.dbzer0.com 10 points 1 month ago

Haha same. God that was such a shit show. My handwriting is terrible lmao

[-] aliser@lemmy.world 104 points 1 month ago
[-] Agent641@lemmy.world 65 points 1 month ago

Probably promoted to middle management instead

[-] SaharaMaleikuhm@feddit.org 24 points 1 month ago

He might be overqualified

[-] SkunkWorkz@lemmy.world 103 points 1 month ago

Yeah, fake. No way you can get 90%+ using ChatGPT without understanding code. LLMs barf out so much nonsense when it comes to code. You have to correct it frequently to make it spit out working code.

[-] Artyom@lemm.ee 11 points 1 month ago

If we're talking about freshman CS 101, where every assignment is the same year-over-year and it's all machine graded, yes, 90% is definitely possible because an LLM can essentially act as a database of all problems and all solutions. A grad student TA can probably see through his "explanations", but they're probably tired from their endless stack of work, so why bother?

If we're talking about a 400 level CS class, this kid's screwed and even someone who's mastered the fundamentals will struggle through advanced algorithms and reconciling math ideas with hands-on-keyboard software.

[-] TootSweet@lemmy.world 85 points 1 month ago

generate code, memorize how it works, explain it to profs like I know my shit.

ChatGPT was just his magic feather all along.

[-] Bashnagdul@lemmy.world 14 points 1 month ago

Dumbo reference

[-] nednobbins@lemm.ee 81 points 1 month ago

The bullshit is that anon wouldn't be fsked at all.

If anon actually used ChatGPT to generate some code, memorize it, understand it well enough to explain it to a professor, and get a 90%, congratulations, that's called "studying".

[-] MintyAnt@lemmy.world 26 points 1 month ago

Professors hate this one weird trick called "studying"

[-] JustAnotherKay@lemmy.world 14 points 1 month ago

Yeah, if you memorized the code and its functionality well enough to explain it in a way that successfully bullshits someone who can sight-read it... you know how that code works. You might need a linter, but you know how that code works and can probably at least fumble your way through a shitty v0.5 of it.

[-] boletus@sh.itjust.works 73 points 1 month ago

Why would you sign up to college to willfully learn nothing

[-] Gutek8134@lemmy.world 44 points 1 month ago* (last edited 1 month ago)

My Java classes at uni:

Here's a piece of code that does nothing. Make it do nothing, but in compliance with this design pattern.

When I say it did nothing, I mean it had literally empty function bodies.

[-] boletus@sh.itjust.works 23 points 1 month ago* (last edited 1 month ago)

Yeah, that's object-oriented programming and interfaces. It's shit to teach people without a practical example, but it's a completely passable way to do OOP in industry: you start by writing interfaces to structure your program and fill in the implementation later.

Now, is it good practice? Probably not; imo software design is impossible to get right without iteration. But people still use this method, so it's good to understand why it sucks.
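For anyone who hasn't seen the "interfaces first, implementation later" style, it looks roughly like this in Java (names invented for illustration):

```java
// Contract first: declare what the program needs, with no logic yet.
interface Storage {
    void save(String key, String value);
    String load(String key);
}

// Early skeleton: compiles and satisfies the design, but "does nothing".
class StubStorage implements Storage {
    @Override
    public void save(String key, String value) {
        // TODO: real implementation comes later
    }

    @Override
    public String load(String key) {
        return null; // TODO: real implementation comes later
    }
}
```

The skeleton lets the rest of the program be written and type-checked against the interface before any real logic exists, which is why an assignment can literally be "make it do nothing, but in compliance with this design pattern".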

[-] TheSlad@sh.itjust.works 19 points 1 month ago

A lot of kids fresh out of high school are pressured into going to college right away. It's the societal norm for some fucking reason.

Give these kids a break and let them go when they're really ready. Personally I sat around for a year and a half before I felt like "fuck, this is boring, let's go learn something now". If I had gone to college straight from high school I would've flunked out and just wasted all that money for nothing.

[-] SoftestSapphic@lemmy.world 13 points 1 month ago

To get the piece of paper that lets you access a living wage

[-] SoftestSapphic@lemmy.world 51 points 1 month ago

This person is LARPing as a CS major on 4chan

It's not possible to write functional code without understanding it, even with ChatGPT's help.

[-] Simulation6@sopuli.xyz 51 points 1 month ago

I don't think you can memorize how code works well enough to explain it and not learn coding.

[-] FlexibleToast@lemmy.world 24 points 1 month ago

It's super easy to learn how algorithms and whatnot work without knowing the syntax of a language. I can tell you how a binary search tree works, but I have no clue how to code it in Java because I've never used Java.
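To that point, the concept really is the simple part; here's a minimal binary search tree sketch in Java (illustrative, not production code):

```java
// Minimal binary search tree: smaller keys go left, larger keys go right.
class BST {
    private static class Node {
        int key;
        Node left, right;
        Node(int key) { this.key = key; }
    }

    private Node root;

    void insert(int key) { root = insert(root, key); }

    private Node insert(Node node, int key) {
        if (node == null) return new Node(key);
        if (key < node.key) node.left = insert(node.left, key);
        else if (key > node.key) node.right = insert(node.right, key);
        return node; // duplicate keys are ignored
    }

    // Lookup repeats the same comparisons, so it costs O(height).
    boolean contains(int key) {
        Node cur = root;
        while (cur != null) {
            if (key == cur.key) return true;
            cur = key < cur.key ? cur.left : cur.right;
        }
        return false;
    }

    public static void main(String[] args) {
        BST t = new BST();
        t.insert(5);
        t.insert(3);
        t.insert(8);
        System.out.println(t.contains(3) + " " + t.contains(7)); // prints "true false"
    }
}
```

Once you can say "smaller left, bigger right, repeat", the rest is syntax an IDE will mostly autocomplete for you.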

[-] TheSlad@sh.itjust.works 16 points 1 month ago

And similarly, I could read code in a language I don't know, and understand what it does and how it works, even if I don't know the syntax well enough to write it myself.

[-] drmoose@lemmy.world 12 points 1 month ago

I'm a full-stack polyglot and tbh I couldn't program in some languages without reference docs / an LLM, even though I ship production code in those languages all the time. Memorizing all of the function and method names and all of the syntax/design-pattern stuff is pretty hard, especially when it's not really needed in contemporary dev.

[-] burgersc12@mander.xyz 27 points 1 month ago

Bro just sneak to the bathroom and use chatgpt on your phone like everyone else does

[-] Xanza@lemm.ee 24 points 1 month ago* (last edited 1 month ago)

pay for school

do anything to avoid actually learning

Why tho?

[-] xelar@lemmy.ml 18 points 1 month ago* (last edited 1 month ago)

Brainless GPT coding is becoming the new norm at uni.

Even if I get the code via ChatGPT, I try to understand what it does. How are you gonna maintain those hundreds of lines if you don't know how they work?

Not to mention, you won't cheat your way through a recruitment interview.

[-] Korhaka@sopuli.xyz 16 points 1 month ago

Open the browser in one VM. Open chatgpt in another VM.

[-] WolfLink@sh.itjust.works 16 points 1 month ago

Any competent modern IDE or compiler will help you find syntax mistakes. Knowing the concepts is way more important.

[-] 2ugly2live@lemmy.world 15 points 1 month ago

He should be grateful. I hear programming interviews are pretty similar, in that the employer provides the code and will pretty much watch you work through it in some cases. Better to be embarrassed now than at interview time. I'm honestly impressed he went the entire time memorizing the code well enough to explain it, and picked up nada.

[-] Psaldorn@lemmy.world 14 points 1 month ago

Now imagine how it'll feel in interviews

[-] nsrxn@lemmy.dbzer0.com 10 points 1 month ago

run it in a vm

[-] RaoulDook@lemmy.world 10 points 1 month ago

Unless they're being physically watched or had their phone sequestered away, they could just pull it up on a phone browser and type it out into the computer. But if they want to be a programmer they really should learn how to code.

this post was submitted on 05 Feb 2025
518 points (100.0% liked)

Greentext
