[-] BioMan@awful.systems 9 points 6 days ago* (last edited 5 days ago)

I have a vague hypothesis, one I am utterly unprepared to make rigorous, that the more of what you take into your mind is the product of another human mind, rather than of a nonhuman process operating on its own terms, the more likely you are to have mental issues.

On the low end this would include the documented protective effect of natural environments against psychotic episodes compared to urban environments (where EVERYTHING was put there by someone's idea). But computers... they are amplifiers of things put out by human minds, with very short feedback loops. Everything is ultimately defined in one way or another by a person who put it there, even if it is then allowed to act according to the rules you laid down.

And then an LLM is the ultimate distillation of the short feedback loop, feeding whatever you shovel into it straight back at you. Even just mathematically, the whole 'transformer' architecture is just a way to take the imputed semantic meanings of tokens early in the stream and jiggle them around to 'transform' that information into the later tokens of the stream. No new information really enters it; it just moves around what you put into it and feeds it back at you in a different form.
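
(A minimal numpy sketch of that claim, with toy dimensions and random matrices standing in for trained weights, so this is an illustration rather than any real model: one step of causal self-attention, where every output position is nothing but a re-weighted mixture of the embeddings already in the stream.)

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 5, 8                          # toy sequence length and embedding width
x = rng.normal(size=(T, d))          # token embeddings: all the information there is

# random projections standing in for learned Q/K/V weights (illustrative only)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv

scores = Q @ K.T / np.sqrt(d)
scores[np.triu_indices(T, k=1)] = -np.inf      # causal mask: no looking ahead

weights = np.exp(scores)
weights /= weights.sum(axis=1, keepdims=True)  # softmax: each row sums to 1

out = weights @ V        # each output row: a weighted average of (projected) inputs
print(weights.round(2))  # row t puts nonzero weight only on positions 0..t
```

Every row of `weights` sums to 1, so each output token is literally a weighted average of projections of what was already in the stream; nothing outside `x` ever enters the computation.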

EDIT: I also sometimes wonder if this has a mechanistic relation to the mode collapse you get when you train one generative model on the output of another, even though nervous systems and ML systems learn in fundamentally different ways (with ML resembling evolution much more than it resembles learning).
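
(A toy illustration of that mode-collapse intuition, under loud assumptions of my own: the "generative model" is just a Gaussian, refit by maximum likelihood each generation on samples drawn from the previous generation. The variance tends to drift toward zero as the tails get resampled away.)

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 0.0, 1.0    # generation-0 "model": a standard Gaussian
n = 50                  # samples each new generation is trained on

for gen in range(201):
    data = rng.normal(mu, sigma, size=n)  # generate from the current model
    mu, sigma = data.mean(), data.std()   # train the next model on that output
    if gen % 40 == 0:
        print(f"gen {gen:3d}: mu = {mu:+.3f}  sigma = {sigma:.3f}")
```

The expected variance shrinks by a factor of (n-1)/n every generation. Tiny, but it compounds, and the tails are the first thing to go.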

[-] BioMan@awful.systems 7 points 2 weeks ago

So.

How much of this is folding all the meme stocks into the one company that actually produces a product people all over the world demonstrably want to pay for over the competition, just to keep their stupid plates spinning?

And how much is religious psychosis, advancing along the singularitarian eschatology from AI to Dyson sphere to rebuilding the universe in His Image?

I can't tell.

[-] BioMan@awful.systems 6 points 2 weeks ago

Does (deservedly) mercilessly bullying Slopya Nadella actually work?

[-] BioMan@awful.systems 11 points 3 weeks ago

This is all he does now

[-] BioMan@awful.systems 18 points 2 months ago* (last edited 2 months ago)

The Great Leader himself, on how he avoids going insane during the ongoing End of the World: among other things, because that's not what an intelligent character would do in a story. But you might not be capable of that.

[-] BioMan@awful.systems 10 points 2 months ago

The long-expected collapse of the rationalists out of their flagging cult into ordinary religion and conspiracy theory continues apace.

[-] BioMan@awful.systems 7 points 3 months ago

I look forward to the cultists continuing to update these graphs, convinced they are seeing the future of the cosmos, as fewer and fewer people pay attention to the fever dreams.

[-] BioMan@awful.systems 7 points 3 months ago

I wonder what will happen to all the data-center-specialized hardware when the demand falls through the floor. SOMEONE will buy it; the question is what people will figure out how to use it for, given that it is not like ordinary consumer hardware.

[-] BioMan@awful.systems 8 points 3 months ago

Offhand, as someone who read their websites out of morbid fascination, just like I read all this stuff out of morbid fascination: it was indeed pretty important to them, at least as presented on the internet.

[-] BioMan@awful.systems 13 points 3 months ago

Gerard and Torres get name-dropped in the same breath as Ziz, as people who have done damage to the rationalist movement from within:

https://www.lesswrong.com/posts/Hun4EaiSQnNmB9xkd/tell-people-as-early-as-possible-it-s-not-going-to-work-out

[-] BioMan@awful.systems 8 points 3 months ago

It's totally clear that current systems are not on the path to that. There's no reason to think it's impossible that something ELSE could be conscious, but there are no pointers towards getting there either.
