Roko: the Animatrix demonstrates why we must destroy the chip fabs
(www.lesswrong.com)
I feel like this scenario depends on a lot of assumptions about the processing speed and energy/resource usage of AIs. A trillion is a big number. Notably, there are currently only about 0.8% of that number of humans, who are much more energy-efficient than AIs.
During WWII everyone computed on slide rules, which had zero transistors. Then they invented the transistor, which had one transistor. Then they started making mainframes and Ataris and C64s, which had like what, a hundred or a thousand? Then they invented computers and Windows and PS1 that had maybe a million transistors. And then we got dual-core CPUs which had double the transistors per transistor. Then they invented GPUs, which is like a thousand tiny CPUs in one CPU. Then they made the i7, which probably has like a billion transistors, and Ryzen, which has ten billion, and the RTX4090 Ti has 79 billion. Now they say China is going to make a phone with a trillion transistors.
That's called exponential growth and assuming perfectly spherical frictionless nanometers it will go on forever. In eight to twelve years schoolchildren will be running GPT6 on their calculators. We will build our houses entirely out of graphics cards. AI will figure out cold fusion any week now and Dennard scaling will never hit its limit, right?
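For what it's worth, the "line goes up forever" arithmetic being mocked here is trivial to write down. A minimal sketch, using the comment's own 79-billion figure and an assumed two-year doubling period (illustrative numbers, not real roadmap data, and of course ignoring the physical limits that are the whole point of the sneer):

```python
# Naive Moore's-law-style extrapolation: assume transistor counts
# double every `doubling_period` years, forever, with no physical limits.

def extrapolate(transistors: int, years: int, doubling_period: float = 2.0) -> int:
    """Project a transistor count `years` ahead under perpetual doubling."""
    return int(transistors * 2 ** (years / doubling_period))

# Start from the comment's own figure of 79 billion transistors.
start = 79_000_000_000
for years in (4, 8, 12):
    print(f"+{years} years: {extrapolate(start, years):,} transistors")
```

Run it and you get the "trillion transistors in your phone" headline within a decade, which is exactly the kind of spherical-frictionless-nanometers projection the comment is making fun of.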
A trillion transistors on our phones? Can't wait to feel the improved call quality and reliability of my video conferencing!
now you're in pure fantasy land.
Yes. So? This has been, and always will be, the case. Uncertainty is the only certainty.
When these assholes say things, the implication is always that the future world looks like everything you care about being fucked and you existing in an imprisoned state of stasis, so you'd better give us control here and now.
Nobody knows and it's impossible for anyone to know so let's all just assume I'm right.
I do love this compulsion of rationalists to use Big Numbers as if they were sufficient arguments.
but what if the number was really really big 🥺
my mind always goes back to the sci-fi classics