Language Is a Poor Heuristic for Intelligence
(karawynn.substack.com)
The term AI has been rebranded multiple times, but the latest usage is definitely a marketing term used to boost investor sentiment.
Investors are falling for it too. It's funny, because I think they were finally coming to terms with the broken promises of untold wealth from tech's earlier offerings, and starting to realize that the underlying business model is pretty limited (surveillance capitalism with ad tech as the end goal, which is why Google is willing to end the open web to stop adblockers). Then, conveniently, "GenAI" gets elevated to prominence as the thing we call "AI," and everyone loses their minds again over fantasy instead of asking how this thing could possibly make any money.
Will I use the free version of this thing to reword "fuck you" in a passive-aggressive, more corporate-friendly way so I can send off an email to the company? Sure. Will I subscribe to a service that provides that capability? Fuck no.
I do have to say, though, that a lot of programmers seem to have a kind of writer's block I just don't understand, so perhaps the only money-generating avenue for these things, other than continuing to push ads, is selling them as corporate tools. Even so, I think the integrations, faddish and obvious as they are, will be the first things to sail out the door when someone gets the bill: the APIs often bill per token, and these piles of garbage require quite a few tokens to be exchanged before they produce any kind of satisfactory result.
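To make the billing point concrete, here is a back-of-envelope sketch of how per-token pricing adds up across an organization. All the numbers (tokens per request, usage rates, headcount, price per 1K tokens) are illustrative assumptions, not real vendor pricing:

```python
# Back-of-envelope estimate for a token-billed LLM API.
# Every figure below is a hypothetical assumption for illustration.

def monthly_api_cost(tokens_per_request: int,
                     requests_per_day: int,
                     users: int,
                     price_per_1k_tokens: float,
                     days: int = 30) -> float:
    """Estimated monthly bill in dollars for token-metered usage."""
    total_tokens = tokens_per_request * requests_per_day * users * days
    return total_tokens / 1000 * price_per_1k_tokens

# Assumed: 2,000 tokens per round trip (prompt + completion),
# 20 requests per employee per day, 500 employees, $0.002 per 1K tokens.
cost = monthly_api_cost(2000, 20, 500, 0.002)
print(f"${cost:,.2f}")
```

Even at a modest-sounding fraction of a cent per thousand tokens, chatty integrations that burn thousands of tokens per interaction turn into a line item someone in finance will eventually notice.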
EDIT: I wanted to add two things:
Yes, now that it has become a marketable product, the term "AI" feels like a buzzword due to overuse. But it is in fact still being used (by most vendors) consistently with how it has been used for some 40 years: ML, video game opponents, chess engines have all been called "AI" for at least that long. Anyone who thinks that calling GPT or Stable Diffusion "AI" started "five minutes ago" (or is in any way novel) can only have encountered the concept through sci-fi movies, not the actual field of AI research that has been developing for decades. It is, therefore, a very clear signal that the person knows fuck all about the subject and so cannot possibly form a valid opinion. It's just a generic angry response.
And frankly, I think ChatGPT would do a better job of that than the author of the article. At least we wouldn't be wasting actual, valuable human brain resources producing this drivel.