this post was submitted on 11 Aug 2025
21 points (100.0% liked)
TechTakes
This does leave out the fixed cost of training the model itself (amortized per video generated), right? Pro-genAI people would say you only pay that once, but we know everything online gets scraped repeatedly now, so there will be constant retraining. (I am mixing video with text here, so lots of big unknowns.)
If they got a lot of usage out of a model, this fixed cost would contribute little to the cost of each generation in the long run... but considering they currently replace/retrain models every 6 months to a year, yeah, this cost should be factored in as well.
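The amortization argument is easy to put in numbers. Every figure below is made up purely to illustrate the shape of the calculation, not a claim about any real model:

```python
# All numbers hypothetical, just to show how amortization works.
TRAIN_COST = 100e6             # assumed one-off training cost, in dollars
MODEL_LIFETIME_MONTHS = 9      # replaced/retrained every 6-12 months, say 9
GENERATIONS_PER_MONTH = 50e6   # assumed usage volume

# Fixed training cost spread over every generation the model serves
# during its (short) lifetime before it gets retrained/replaced:
amortized = TRAIN_COST / (MODEL_LIFETIME_MONTHS * GENERATIONS_PER_MONTH)
print(f"training cost per generation: ${amortized:.4f}")
```

The point being: the shorter the retraining cycle, the less the "you only train once" framing holds up, because the denominator never gets large enough to make the fixed cost vanish.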
Also, training compute grows quadratically with model size, because it is the product of the amount of training data (which, for compute-optimal training, grows linearly with model size) and the model size itself.
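That quadratic growth follows from the usual Chinchilla-style rules of thumb (training FLOPs ≈ 6·N·D, with compute-optimal data D ≈ 20·N tokens); a quick sketch under those assumptions:

```python
# Chinchilla-style heuristics (approximations, not exact for any given model):
# training FLOPs ~ 6 * N * D, and compute-optimal data D ~ 20 * N tokens.
def train_flops(n_params: float) -> float:
    d_tokens = 20 * n_params        # data grows linearly with model size
    return 6 * n_params * d_tokens  # = 120 * N**2, i.e. quadratic in N

# Doubling the parameter count roughly quadruples training compute:
ratio = train_flops(2 * 70e9) / train_flops(70e9)
print(ratio)  # → 4.0
```

So scaling a model up 10x means roughly 100x the training compute, which is why the "train once" fixed cost is not a fixed cost at all once you keep scaling and retraining.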