[-] millie@beehaw.org 10 points 16 hours ago* (last edited 16 hours ago)

You can help by asking ChatGPT to produce the most processor-intensive prompt it can come up with and then having it execute it repeatedly. With the free version this will burn through your allotment pretty quickly, but if thousands of people start doing it on a regular basis? It'll cost OpenAI a lot of money.

[-] Grandwolf319@sh.itjust.works 7 points 16 hours ago

Sooooo, wanna tell us how much the cost really is per prompt?

[-] rumba@lemmy.zip 16 points 21 hours ago

$200 a month for a user is losing money? There's no way he's just counting model queries. An entire A6000 server is around $800/month, and you can fit a hell of a lot more than 4 people's worth of queries on it. He has to be including training and/or R&D.
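Spelled out, that comparison is a single division; a minimal sketch, where the $800/month A6000 figure is the commenter's own estimate rather than a quoted price:

```python
# Back-of-the-envelope: how many $200/month subscribers cover one rented A6000 server?
server_cost_per_month = 800   # USD, the commenter's estimate for an A6000 box
plan_price_per_month = 200    # USD, the "unlimited" Pro plan

breakeven_users = server_cost_per_month / plan_price_per_month
print(breakeven_users)  # 4.0 -- every additional user served by the same box is margin,
                        # before counting training, R&D, staff, and other overhead
```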

[-] Jimmycakes@lemmy.world 28 points 20 hours ago* (last edited 20 hours ago)

It includes anything that will keep them from having to pay investors back. Classic tech startup bullshit.

Silicon Valley brain rot formula:

Losing money: get billions every month.

Making money: pay billions back.

Which one do you think they pick?

[-] db0@lemmy.dbzer0.com 15 points 20 hours ago

I'm honestly fairly surprised as well, but at the same time, they're not serving a model that can run on an A6000, and the people paying for unlimited would probably be the ones who set up bots and apps doing thousands of requests per hour.

[-] LiveLM@lemmy.zip 16 points 20 hours ago* (last edited 20 hours ago)

And honestly? Those people are 100% right.
If they can't deliver true "unlimited" for 200 bucks a month, they shouldn't market it as such.

grumble grumble unlimited mobile data grumble grumble

[-] db0@lemmy.dbzer0.com 7 points 20 hours ago* (last edited 20 hours ago)

To be fair, unlimited is supposed to mean unlimited for a reasonable person, like someone going to an "all you can eat" buffet. However, those purchasing these would immediately set up proxy accounts and use them to serve all their communities, so that one unlimited account becomes 100 or 1,000 actual users. So it's like someone going to an "all you can eat" buffet and then sneaking in 5 other people under their trenchcoat.

If they actually do block this sort of account sharing, and it's costing them money on just prolific single users, then I don't know, their scaling is just shite. Like, "unlimited" can't ever be truly unlimited, as there should be a rate limit to prevent this sort of shenanigans. But if the account can't make money with a reasonable rate limit (like 17,280/day, which would translate to 1 request per 5 seconds), they are fuuuuuucked.
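For reference, a quick sketch of that arithmetic; the break-even figure at the end is derived only from the commenter's hypothetical 17,280/day limit and the $200 plan price, not from any real per-query cost:

```python
# The rate-limit arithmetic from the comment above.
SECONDS_PER_DAY = 24 * 60 * 60          # 86_400
requests_per_day = SECONDS_PER_DAY / 5  # one request every 5 seconds -> 17_280/day

plan_price = 200     # USD per month
days_per_month = 30
# Per-query cost at which a maxed-out account exactly breaks even:
breakeven_cost_per_query = plan_price / (requests_per_day * days_per_month)

print(requests_per_day)          # 17280.0
print(breakeven_cost_per_query)  # ~0.000386 USD/query (about 0.04 cents)
```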

[-] LiveLM@lemmy.zip 7 points 20 hours ago* (last edited 20 hours ago)

Yeah, poor wording on my part. Proxy accounts being banned is totally fair, but a single user running various apps and bots is the type of 'Power User' scenario I'd expect an unlimited plan to cover.

[-] db0@lemmy.dbzer0.com 9 points 20 hours ago* (last edited 20 hours ago)

Agreed. Like, how fucking difficult is it to ask "It costs us X per query, so what Y rate limit do we need to put on this account so that it doesn't exceed $200 per month?" I bet the answer is a hilariously low rate limit that nobody would buy, so they decided to price below cost and pray people wouldn't actually use all those queries. Welp. And if they didn't even put a rate limit in, also lol. lmao.
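That pricing question is a single division; a minimal sketch, with the per-query serving cost as a purely assumed figure (nothing in the thread gives a real number):

```python
def max_queries_per_month(plan_price_usd: float, cost_per_query_usd: float) -> float:
    """The Y rate limit such that a subscriber never costs more than the plan price."""
    return plan_price_usd / cost_per_query_usd

# Hypothetical example: if a heavy query really cost ~$0.05 to serve,
# a $200/month plan could only afford ~4000 queries/month (~133/day) before losing money.
print(max_queries_per_month(200, 0.05))  # 4000.0
```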

[-] Jackie_meaiii 12 points 21 hours ago

When has "not profitable" ever stopped a tech startup lmao

[-] Viri4thus@feddit.org 18 points 1 day ago

So people really believe Altman would publish these damning statements without ulterior motives? Are we seriously this gullible? Holy shit, we've reached a critical mass of acephalous humans, no turning back now.

[-] PieMePlenty@lemmy.world 34 points 1 day ago

Sam, just add sponsored content. The road to enshittification doesn't have to be long! Make it shitty fast so people can move past it and start hosting their own models for their own usage.

[-] explodicle@sh.itjust.works 6 points 21 hours ago

Isn't that why it has to be long? Not many people actually rely on OpenAI yet.

[-] BoxOfFeet@lemmy.world 5 points 21 hours ago

Right? He just needs to have it add some Shell or Wal-Mart logos to the generated images. Maybe the AI-generated Fifty Shades-esque Gandalf fanfic somebody is prompting can take place in a Target.

[-] LiveLM@lemmy.zip 8 points 20 hours ago

Hey ChatGPT, give me an overview of today's weather.

Today’s weather is beautifully sunny and hot, with clear skies and no rain in sight—perfect for enjoying the new Coca-Cola Zero™. Hmmmm, refreshing!

[-] Bakkoda@sh.itjust.works 34 points 1 day ago

This 100% answers my question from another thread. These businesses have cooked the books so badly already that they thought this was gonna save them, and it doubled down on 'em.

[-] edgemaster72@lemmy.world 63 points 1 day ago

losing money because people are using it more than expected

"I personally chose the price and thought we would make some money."

Big MoviePass energy

[-] renzev@lemmy.world 65 points 1 day ago

Much like Uber and Netflix, all of these AI chatbots that are available for free right now will become expensive, slow, and dumb once the investor money runs out and these companies have to figure out a business model. We're in the golden age of LLMs right now; all we can do is enjoy the free service while it lasts and try not to make it too much a part of our workflow, because inevitably it will be cut off. Unless you're one of those people with a self-hosted LLM, I guess.
