I wish that were true, but this doesn't threaten any monopoly.
~~It certainly does.~~
~~Until last week, you absolutely NEEDED an NVidia GPU with CUDA to run any AI model.~~
~~Today, that is simply not true. (watch the video at the end of this comment)~~
~~I watched the video and my initial reaction to this news was validated and then some: it made me even more bearish on NVDA.~~
Edit: corrected and redacted.
Mate, that means they are using PTX directly. If anything, they are more dependent on NVIDIA and the CUDA platform than anyone else.
To simplify: they are bypassing the CUDA API, not the NVIDIA instruction set architecture and not CUDA as a platform.
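For a rough idea of what "using PTX directly" looks like, here's a minimal sketch (my own toy example, not anything from DeepSeek's code): a CUDA kernel where one addition is written as inline PTX instead of plain CUDA C++. Even though the C++ expression is bypassed, the asm block emits NVIDIA's PTX virtual ISA, so the whole thing still only builds and runs through NVIDIA's toolchain (nvcc and the driver's PTX JIT).

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: adds 1.0f to every element, but the add itself is
// expressed as raw PTX rather than CUDA C++. PTX is still NVIDIA's
// own virtual instruction set, so this is *more* NVIDIA-specific,
// not less.
__global__ void add_one_ptx(const float* in, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float x = in[i];
        float one = 1.0f;
        float y;
        // Equivalent to `y = x + one;`, written as an inline PTX instruction.
        asm("add.f32 %0, %1, %2;" : "=f"(y) : "f"(x), "f"(one));
        out[i] = y;
    }
}

int main() {
    const int n = 8;
    float h_in[n], h_out[n];
    for (int i = 0; i < n; ++i) h_in[i] = float(i);

    float *d_in, *d_out;
    cudaMalloc(&d_in, n * sizeof(float));
    cudaMalloc(&d_out, n * sizeof(float));
    cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);

    add_one_ptx<<<1, n>>>(d_in, d_out, n);
    cudaMemcpy(h_out, d_out, n * sizeof(float), cudaMemcpyDeviceToHost);

    for (int i = 0; i < n; ++i) printf("%g -> %g\n", h_in[i], h_out[i]);

    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}
```

Nothing here touches a vendor-neutral layer; dropping down to PTX just trades the CUDA C++ surface for an even lower-level NVIDIA interface.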
Ahh. Thanks for this insight.
also not true
Thanks for the corrections.