[-] umbraroze@slrpnk.net 12 points 3 weeks ago

Quantum computing, probably.

Problem is, it has the potential to become actual reality. Tech bros need their products to be 99% blue-sky hype to get their financing, and they can't risk some nerd going "well, actually, what you're suggesting can't be done any more efficiently on a quantum computer than it can on a classical one".

[-] cronenthal@discuss.tchncs.de 10 points 3 weeks ago

They're trying with "Quantum Computers" and "Humanoid Robots". One promises magic and the other slaves, so you see the appeal for investors.

[-] rational_lib@lemmy.world 10 points 3 weeks ago

Reminds me of Blockchain

According to new research from Deloitte, 74 percent of large companies (with sales over $500 million) see a “compelling business case” for blockchain technology.

Indeed, from supply chain management and regulatory monitoring to recruiting and healthcare, organizations are applying blockchain to their business models to revolutionize how they track and verify transactions.

It's not a fake or fundamentally useless technology, but everyone who doesn't understand it is rushing to figure out how they're gonna claim to use it.

[-] IDrawPoorly@lemm.ee 10 points 3 weeks ago

AND the huge AR/metaverse wave!

[-] JackFrostNCola@lemmy.world 7 points 3 weeks ago

Oh yeah that week was crazy

[-] GoodOleAmerika@lemmy.world 9 points 3 weeks ago

AI is here to stay imo. This is not crypto

[-] debil@lemmy.world 11 points 3 weeks ago

Sure but this is about AI hype.

[-] Daryl@lemmy.ca 9 points 3 weeks ago

AI is now a catch-all acronym that is becoming meaningless. The old, conventional light switch on the wall of the house I first lived in some 70 years ago could be classified as 'AI'. The switch makes a decision based on what position I put it in. I turn the light on, and it remembers that decision and stays on. The thing is, the decision was first made by me; the switch merely carried it out, based on criteria that were designed into it.

That is, AI still does not make any decision that humans have not designed it to make in the first place.

What is needed is more appropriate terminology describing the actual process behind what we call AI. And really, the more appropriate descriptor would not be Artificial Intelligence but Human-made Intelligent devices. All of these so-called AI devices and applications are, after all, completely human-designed and human-made. The originating intelligence still comes from the minds of humans.

Most of the applications which we call Artificial Intelligence are actually Algorithmic Intelligence - decisions made based on algorithms designed by humans in the first place. The devices just follow these algorithms. Since humans have written these algorithms, it should really be no surprise that these devices are making decisions very similar to the decisions humans would make. Duhhh. We made them in our own image, no wonder they 'think' like us.

Really, these AI devices do not make decisions, they merely follow the decisions humans first designed into them.

Deep Blue, the IBM chess-playing computer, plays excellent chess because humans designed it to play chess and to make chess decisions, based on how humans first designed the game of chess.

What would be really scary would be if Deep Blue decided of its own volition that it no longer wanted to play chess, but wanted to play a game it designed itself.

[-] Sunsofold@lemmings.world 9 points 3 weeks ago

In this thread: people doing the exact opposite of what they do seemingly everywhere else and ignoring the title to respond to the post.

Figuring out what the next big thing will be is obviously hard; if it were easy, investing would be so easy as to be cheap.

I feel like a lot of what has been exploding lately is ideas someone had a long time ago that are just now becoming easier and getting more PR. 3D printing was invented in the '80s but had to wait for computation and cost reduction. The idea that would become neural networks is from the '50s and was toyed with repeatedly over the years, but ultimately the big breakthrough was just that computing became cheap enough to run massive server farms. AR stems back to the '60s and gets trotted out, slightly better, each generation or so, but it was just tech getting smaller that made it more viable. What other theoretical ideas from the last century could now be done for a much lower price?

[-] PattyMcB@lemmy.world 8 points 3 weeks ago

Don't forget "THE CLOUD" and "IoT"

[-] some_guy@lemmy.sdf.org 8 points 3 weeks ago

Hey, the blockchain completely revolutionized everything. /s

[-] jsomae@lemmy.ml 8 points 2 weeks ago* (last edited 2 weeks ago)

I think they'll be on this for a while, since unlike NFTs this is actually useful tech. (Though not in every field yet, certainly.)

There are going to be some sub-fads related to GPUs and AI that the tech industry will jump on next. All this is speculation:

  • Floating-point operations will be replaced by highly quantized integer math, which is much faster and more efficient, and almost as accurate (see the sketch after this list). There will be some buzzword like "quantization" thrown out to the general public. Recall "blast processing" for the Sega. It will be the downfall of NVIDIA, and for a few months the reduced power consumption will cause AI companies to clamor about being green.
  • (The marketing of) personal AI assistants (to help with everyday tasks, rather than just queries and media generation) will become huge; this scenario predicts 2026 or so.
  • You can bet that tech will find ways to deprive us of ownership over our devices and software; hard drives will get smaller to force users to use the cloud more. (This will have another buzzword.)
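
For the curious: the quantization in the first bullet is already real in inference stacks (int8/int4 weights). Here is a minimal sketch of symmetric int8 quantization in Python, assuming NumPy; the function names are illustrative, not any particular library's API:

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Map float32 values onto int8 codes with one shared symmetric scale."""
    scale = np.abs(x).max() / 127.0          # largest magnitude maps to +/-127
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 values from the int8 codes."""
    return q.astype(np.float32) * scale

x = np.random.randn(8).astype(np.float32)
q, scale = quantize_int8(x)
x_hat = dequantize(q, scale)
# Rounding error is bounded by half a scale step: that bound is the
# "almost as accurate" tradeoff the comment is pointing at, bought in
# exchange for 4x smaller weights and cheap integer arithmetic.
print(np.abs(x - x_hat).max(), scale / 2)
```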
[-] WorldsDumbestMan@lemmy.today 8 points 3 weeks ago

Well, all they have to do is teach the AI to do one task decently and consistently, then move on to the next task, until it takes 99% of human jobs, and then they can kill off an increasing number of humans.

[-] oce@jlai.lu 7 points 3 weeks ago* (last edited 3 weeks ago)

VR and AR will get a second run once the UX is improved and the hardware to run them becomes small and cheap enough. Quantum computing is around the corner.

[-] penfore@lemmy.world 7 points 3 weeks ago

I'm waiting for the cheap graphics cards

[-] 0x01@lemmy.ml 6 points 3 weeks ago

Next is quantum computing
