There's a very long history of extremely effective labor-saving tools in software.
Writing in C rather than Assembly, especially for more than one platform.
Standard libraries. Unix itself. More recently, developing games in Unity or Unreal instead of rolling your own engine.
And what happened when any of these tools came on the scene was a mad gold rush to develop products that weren't feasible before. Not layoffs, not "we don't need to hire junior developers any more".
Rank-and-file vibe coders seem to perceive Claude Code (for some reason, mostly just Claude Code) as something akin to the advantage of using C rather than Assembly. They are legit excited to code new things they couldn't code before.
Boiling the rivers to give them an occasional morale boost with "You are absolutely right!" is completely fucked up and I dread the day I'll have to deal with AI-contaminated codebases, but apart from that, they have something positive going for them, at least in this brief moment. They seem to be sincerely enthusiastic. I almost don't want to shit on their parade.
The AI enthusiast bigwigs, on the other hand, are firing people, closing projects, and talking about not hiring juniors any more, and they got the media to report on it as AI layoffs. They just gleefully go on about how being 30% more productive means they can fire a bunch of people.
The standard answer is that they hate having employees. But they always hated having employees. And there were always labor-saving technologies.
So I have a thesis here, or a synthesis perhaps.
The bigwigs who tout AI (while acknowledging that it needs humans for now) don't see AI as ultimately useful, in the way the C compiler was useful. Even if it's useful in some context, they still don't. They don't believe it can be useful. They see it as more powerfully useless. Each new version is meant to be a bit more like AM or (clearly AM-inspired, but more familiar) GLaDOS, something that will get rid of all the employees once and for all.
Very interesting! I didn't realize there was this historical division between Fortran and C. I thought C was just "better" because it came later.
Oh, not at all. It would be very rude of me to describe C as a pathogen transmitted through the vector of Unix, so I won't, even if it's mostly accurate to say so.
Many high-level systems programming languages predate C, like the aforementioned Fortran, Pascal, PL/I, and the ALGOL family. The main advantage C had over them in the early 1970s was its relatively light implementation. The older, bigger languages were generally considered superior to C for actual practical use on systems that could implement them, i.e. not a tiny cute little PDP-7.
Since then C has grown some more features and a horrible standard filled to the brim with lawyerly weasel words that let compilers optimize code in strange and terrifying ways, allowing it to exist as something of a lingua franca of systems programming, but at the time of its birth C wouldn't have been seen as anything particularly revolutionary.
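To make the "weasel words" bit concrete: a classic illustration is signed integer overflow, which the standard leaves undefined, so the optimizer is free to assume it never happens. A minimal sketch (exact behavior depends on your compiler and flags; GCC and Clang at -O2 commonly fold the comparison to a constant):

    #include <limits.h>
    #include <stdio.h>

    /* Signed overflow is undefined behavior in standard C, so the
     * compiler may assume x + 1 never wraps. Under optimization this
     * whole function is typically folded to "return 1". */
    int next_is_bigger(int x) {
        return x + 1 > x;
    }

    int main(void) {
        /* Naively you'd expect 0 here, since INT_MAX + 1 "wraps" to
         * INT_MIN on two's complement hardware, but with optimization
         * on, this commonly prints 1. */
        printf("%d\n", next_is_bigger(INT_MAX));
        return 0;
    }

That gap between what the hardware actually does and what the standard permits the compiler to assume is exactly the strange and terrifying latitude I mean.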
Posting 'cause I'm afraid people will miss this. This is what a PDP-7 looked like: https://en.wikipedia.org/wiki/PDP-7
@Soyweiser @bitofhope Is that a built-in oscilloscope in the center section?
So cool ... but WHY
It's used as a display that does 1024x1024 bit raster graphics.
https://vintagetek.org/pdp7/