[-] CompactFlax@discuss.tchncs.de 100 points 3 weeks ago* (last edited 3 weeks ago)

ChatGPT loses money on every query their premium subscribers submit. They lose money when people use copilot, which they resell to Microsoft. And it’s not like they’re going to make it up on volume - heavy users are significantly more costly.

This isn’t unique to ChatGPT.

Yes, it has its uses; no, it cannot continue in the way it has so far. Is it worth more than $200/month to you? Microsoft is tearing up datacenter deals. I don’t know what the future is, but this ain’t it.

ETA: I think that management gets the most benefit, by far, and that’s why there’s so much talk about it. I recently needed to lead a meeting and spent some time building the deck with an LLM; it took me 20 minutes to do something that would otherwise have taken over an hour. When that is your job, alongside responding to emails, it’s easy to see the draw. Of course, many of these people are in Bullshit Jobs.

[-] brucethemoose@lemmy.world 47 points 3 weeks ago

OpenAI is massively inefficient, and Altman is a straight-up con artist.

The future is more power efficient, smaller models hopefully running on your own device, especially if stuff like bitnet pans out.

[-] CompactFlax@discuss.tchncs.de 9 points 3 weeks ago

Entirely agree with that. Except to add that so is Dario Amodei.

I think it’s got potential, but cost and accuracy are two pieces that need to be addressed. DeepSeek is headed in the right direction, precisely because they didn’t have the insane dollars that Microsoft and Google throw at OpenAI and Anthropic respectively.

Even with massive efficiency gains, though, the hardware market is going to do well if we’re all running local models!

[-] brucethemoose@lemmy.world 8 points 3 weeks ago

Alibaba's QwQ 32B is already incredible, and runnable on 16GB GPUs! Honestly it’s a bigger deal than DeepSeek R1, as were many open models before it; they just didn’t get the finance-media attention DeepSeek did. And Alibaba is releasing a new series this month.
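
A back-of-the-envelope check on the 16GB claim (the bit-widths below are illustrative quantization levels, not the model's actual release formats):

```python
# Rough weight-memory footprint of a model at a given quantization level.
# Ignores the KV cache and runtime overhead, which add a couple more GB.
def weight_gb(params_billions: float, bits_per_weight: float) -> float:
    return params_billions * bits_per_weight / 8

print(weight_gb(32, 16))   # fp16: 64.0 GB -- needs multiple GPUs
print(weight_gb(32, 4))    # 4-bit quant: 16.0 GB -- borderline on a 16 GB card
print(weight_gb(32, 3.5))  # ~3.5 bpw quant: 14.0 GB -- leaves room for the cache
```

Quantizing from fp16 to ~4 bits per weight is what turns a datacenter-only model into one a single consumer GPU can hold.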

Microsoft just released a 2B bitnet model, today! And that’s their paltry underfunded research division, not the one training “usable” models: https://huggingface.co/microsoft/bitnet-b1.58-2B-4T

Local, efficient ML is coming. That’s why Altman and everyone are lying through their teeth: scaling up infinitely is not the way forward. It never was.

[-] deegeese@sopuli.xyz 13 points 3 weeks ago

I fucking hate AI, but an AI coding assistant that is basically a glorified StackOverflow search engine is actually worth more than $200/month to me professionally.

I don’t use it to do my work, I use it to speed up the research part of my work.

I do think there will have to be some cutting back, but it provides capitalists with the ability to discipline labor and absolve themselves (I would never do such a thing, it was the AI what did it!), which they might consider worth the expense.

[-] anomnom@sh.itjust.works 6 points 3 weeks ago

Might be cheaper than CEO fall guys, now that the anti-DEI push is stopping them from using “first woman CEOs” with their lower pay as the scapegoats.

[-] Bytemeister@lemmy.world 6 points 3 weeks ago

That's the business model these days. OpenAI and other AI companies are following the disrupt (or enshittification) business model.

  1. Acquire capital/investors to bankroll your project.
  2. Operate at a loss while undercutting your competition.
  3. Once you are the only company left standing, hike prices and cut services.
  4. Ridiculous profit.
  5. When your customers can no longer deal with the shit service and high prices, take the money, fold the company, and leave the investors holding the bag.

Now you've got a shit-ton of your own capital, so start over at step 1, and just add an extra step where you transfer the risk/liability to new investors over time.

[-] LaLuzDelSol@lemmy.world 5 points 3 weeks ago

Right, but most of their expenditures are not in the queries themselves but in model training. I think capital for training will dry up in the coming years, but people will keep running queries on the existing models, with more and more emphasis on efficiency. I hate AI overall, but it does have its uses.

[-] CompactFlax@discuss.tchncs.de 3 points 3 weeks ago

No, that’s the thing. There’s still significant expenditure to simply respond to a query. It’s not like Facebook, where it costs $1 million to build and $0.10/month for every additional user. It’s $1 billion to build and $1 per query. There’s no recouping the cost at scale like previous tech innovations: the more use it gets, the more it costs to run, in a straight line, not asymptotically.
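
The contrast can be sketched with numbers in the same spirit as the Facebook comparison (every figure here is made up for illustration, not anyone's actual books):

```python
# Hypothetical cost curves; all dollar figures are illustrative, not real.
def classic_webapp_cost(users: int) -> float:
    build, per_user_month = 1_000_000, 0.10   # big fixed cost, tiny marginal cost
    return build + users * per_user_month

def llm_service_cost(queries: int) -> float:
    build, per_query = 1_000_000_000, 1.0     # huge fixed cost AND big marginal cost
    return build + queries * per_query

# Per-use cost falls toward $0.10/month for the app,
# but can never fall below $1/query for the LLM service.
for n in (1_000_000, 100_000_000):
    print(classic_webapp_cost(n) / n, llm_service_cost(n) / n)
```

That floor on marginal cost is the whole argument: scale amortizes the build cost, but it can't amortize the per-query compute.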

[-] LaLuzDelSol@lemmy.world 8 points 3 weeks ago

No way is it $1 per query. Hell, a lot of these models you can run on your own computer, with no cost apart from a few cents of electricity (plus datacenter upkeep).
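
For the local case, the electricity is genuinely tiny. A rough sketch with assumed numbers (300 W GPU draw, 10 s per reply, $0.15/kWh — all hypothetical):

```python
# Rough electricity cost of one locally-run query, in US cents.
# All inputs are illustrative assumptions, not measured figures.
def local_query_cost_cents(gpu_watts=300, seconds=10, usd_per_kwh=0.15):
    kwh = gpu_watts * seconds / 3_600_000  # watt-seconds -> kilowatt-hours
    return kwh * usd_per_kwh * 100

print(local_query_cost_cents())  # roughly a hundredth of a cent per query
```

Even off by 10x in every input, that's still well under a cent — the hosted $1/query figure, if accurate, would have to be mostly hardware amortization and overhead, not power.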

[-] umbrella@lemmy.ml 1 points 3 weeks ago

Are you telling me I can spam these shitty services to lose them money?

[-] kameecoding@lemmy.world 1 points 3 weeks ago

Companies will just in-house some models and train them on their own data, making them both more efficient and more relevant to their domain.

this post was submitted on 15 Apr 2025
1517 points (100.0% liked)

memes
