What do you think the tech bros will jump on next?
(feddit.org)
Alibaba's QwQ 32B is already incredible, and runnable on 16 GB GPUs! Honestly it's a bigger deal than DeepSeek R1, and so were many open models before it; they just didn't get the finance-media attention DeepSeek got. And Alibaba is releasing a new series this month.
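Rough math on why a 32B model can squeeze onto a 16 GB card: it comes down to weight-only quantization around 4 bits per parameter, which is what most local GGUF quants target. A back-of-the-envelope sketch (approximate figures only, ignoring the KV cache and quantization overhead):

```python
# Back-of-the-envelope VRAM estimate for weight-only quantization.
# Numbers are approximate; real quantized files carry extra overhead
# (embeddings, scales, KV cache), so treat this as a rough guide.

def weight_vram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate memory for the weights alone, in gigabytes."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4, 3):
    print(f"QwQ 32B at {bits}-bit: ~{weight_vram_gb(32, bits):.0f} GB of weights")

# 16-bit: ~64 GB, 8-bit: ~32 GB, 4-bit: ~16 GB, 3-bit: ~12 GB.
# A ~4-bit quant is what puts a 32B model within reach of a 16 GB card;
# in practice you also need headroom for the KV cache, so people run a
# slightly lower quant or offload a few layers to CPU.
```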
Microsoft just released a 2B BitNet model today! And that's from their paltry, underfunded research division, not the one training "usable" models: https://huggingface.co/microsoft/bitnet-b1.58-2B-4T
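For scale, here's the same kind of back-of-the-envelope math for a 2B model at BitNet's ~1.58 bits per weight versus ordinary fp16 (approximate, ignoring activations and the KV cache):

```python
# Rough weight-memory comparison for a ~2B-parameter model:
# standard fp16 vs a 4-bit quant vs BitNet's ~1.58-bit ternary weights.
# Approximate; ignores activations, KV cache, and per-tensor scale overhead.

PARAMS = 2e9  # ~2B parameters, as in bitnet-b1.58-2B-4T

for label, bits in [("fp16", 16), ("int4 quant", 4), ("BitNet b1.58", 1.58)]:
    gb = PARAMS * bits / 8 / 1e9
    print(f"{label:>13}: ~{gb:.2f} GB of weights")

# fp16:         ~4.00 GB
# int4 quant:   ~1.00 GB
# BitNet b1.58: ~0.40 GB  -- small enough for phones and cheap CPUs
```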
Local, efficient ML is coming. That's why Altman and the rest are lying through their teeth: scaling up infinitely is not the way forward. It never was.