[-] yuki2501@lemmy.world 12 points 1 year ago

Good riddance.

[-] banneryear1868@lemmy.world 12 points 1 year ago* (last edited 1 year ago)

Of course it will, all these companies are funded by tech giants and venture capital firms. They don't make money, they cost money.

[-] NGC2346@sh.itjust.works 10 points 1 year ago

It's fine, I've got my own LLaMA at home; it does almost the same as GPT.

[-] RoyalEngineering@lemmy.world 9 points 1 year ago

Which one? What are your system specs?

I’ve been thinking about doing this too.
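
For anyone wondering what running a LLaMA model at home actually looks like, here is a minimal sketch assuming the llama-cpp-python bindings and a quantized 7B model file; the file name and parameters are illustrative, not the commenter's actual setup:

```python
# Minimal local-inference sketch using llama-cpp-python (assumed installed via
# `pip install llama-cpp-python`). A 4-bit quantized 7B model needs roughly
# 4-6 GB of RAM and runs acceptably on a recent CPU; a GPU is optional.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-7b-chat.Q4_K_M.gguf",  # illustrative path to a quantized model
    n_ctx=2048,    # context window
    n_threads=8,   # CPU threads to use
)

out = llm(
    "Q: Summarize why large language models are expensive to run. A:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"])
```

Rough rule of thumb: a 4-bit 7B model fits in about 4 GB of memory and a 13B model in about 8 GB, so a mid-range desktop is enough for the smaller variants.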

[-] kitonthenet@kbin.social 5 points 1 year ago

Why shouldn’t I root for the destruction of a technology whose proponents openly gloat will be used to my detriment?

[-] Widowmaker_Best_Girl@lemmy.world 9 points 1 year ago* (last edited 1 year ago)

Well, I was happily paying them to lewd up the chatbots, but then they emailed me telling me to stop. I guess they don't want my money.

[-] kitonthenet@kbin.social 8 points 1 year ago

That’s a lot of crypto coins to sell

[-] stu@lemmy.pit.ninja 7 points 1 year ago

If ChatGPT only costs $700k per day to run and they have a $10b war chest, then assuming there were no other overhead or development costs, OpenAI could run ChatGPT for 39 years. I'm not saying the premise of the article is flawed, but seeing as those are the only two relevant data points presented in this (honestly poorly written) article, I'm more than a little dubious.
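
The 39-year figure is just simple arithmetic, taking the article's two numbers at face value:

```python
# Back-of-the-envelope check, assuming only the article's $700k/day running cost
# and the $10B Microsoft investment, with no other spending.
daily_cost = 700_000
war_chest = 10_000_000_000

days = war_chest / daily_cost   # ~14,286 days
print(days / 365)               # ~39.1 years
```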

But, as a thought experiment, let's say there's some truth to the claim that they're burning through their stack of money in just one year. If things get too dire, Microsoft will just buy 51% or more of OpenAI (they're going to be at 49% anyway after the $10b deal), take controlling interest, and figure out a way to make it profitable.

What's most likely going to happen is OpenAI is going to continue finding ways to cut costs like caching common query responses for free users (and possibly even entire conversations, assuming they get some common follow-up responses). They'll likely iterate on their infrastructure and cut costs for running new queries. Then they'll charge enough for their APIs to start making a lot of money. Needless to say, I do not see OpenAI going bankrupt next year. I think they're going to be profitable within 5-10 years. Microsoft is not dumb and they will not let OpenAI fail.
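
The caching idea is straightforward in principle. Here is a minimal sketch of what response caching for free-tier queries could look like, with all names hypothetical and a plain dict standing in for whatever store would actually be used:

```python
import hashlib

# Hypothetical in-memory cache; a real deployment would use a shared key-value
# store with an expiry, but the idea is the same.
_cache: dict[str, str] = {}

def _normalize(prompt: str) -> str:
    # Collapse case and whitespace so trivially different phrasings share a key.
    return " ".join(prompt.lower().split())

def cached_completion(prompt: str, generate) -> str:
    """Return a cached answer if this (normalized) prompt was seen before,
    otherwise pay for one model call and remember the result."""
    key = hashlib.sha256(_normalize(prompt).encode()).hexdigest()
    if key not in _cache:
        _cache[key] = generate(prompt)  # inference cost only on a cache miss
    return _cache[key]
```

The saving depends entirely on how often free users ask near-identical questions; anything personalized or deep into a conversation would still miss the cache.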

[-] donuts@kbin.social 6 points 1 year ago

They're gonna be in even bigger trouble when it's determined that AI training, especially for content generation, is not fair use and they have to pay each and every person whose data they've used.

[-] mycroftholmess@lemm.ee 6 points 1 year ago

It's definitely become part of a lot of people's workflows. I don't think OpenAI can die. But the need of the hour is to find a way to improve efficiency several times over, which would make it cheaper, more powerful, and more accessible.

[-] Saganastic@kbin.social 5 points 1 year ago

I think they're just trying to get people hooked, and then they'll start charging for it. It even says at the bottom of the page when you're in a chat:

Free Research Preview. ChatGPT may produce inaccurate information about people, places, or facts. ChatGPT August 3 Version

[-] kitonthenet@kbin.social 5 points 1 year ago

At a $250mm/yr burn rate and a revenue of… a lot less than that, they can die pretty quickly

[-] tonytins@pawb.social 4 points 1 year ago

The thing about all GPT models is that they rely on word-frequency statistics to determine how a word is used. That means the only way to get good results is to run them on cutting-edge hardware designed specifically for the job, with the model itself weighing in at almost a TB. Meanwhile, diffusion models are only gigabytes in size and run on a consumer GPU, yet still produce masterpieces, because they already know what each word is associated with.
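
The size gap the comment points at is mostly parameter-count arithmetic. GPT-4's size is undisclosed, so the numbers below use GPT-3's published 175B parameters and a roughly 1B-parameter diffusion model as stand-ins:

```python
# Rough storage math: bytes per parameter times parameter count.
GB = 1e9

gpt3_params = 175e9       # GPT-3 (published figure); GPT-4's size is undisclosed
diffusion_params = 1e9    # ballpark for a Stable-Diffusion-class model

print(gpt3_params * 4 / GB, "GB at fp32")        # ~700 GB, i.e. "almost a TB"
print(gpt3_params * 2 / GB, "GB at fp16")        # ~350 GB
print(diffusion_params * 2 / GB, "GB at fp16")   # ~2 GB, fits on a consumer GPU
```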
