"Drinking alone tonight?" the bartender asks.
From page 17:
Rather than encouraging critical thinking, in core EA the injunction to take unusual ideas seriously means taking one very specific set of unusual ideas seriously, and then providing increasingly convoluted philosophical justifications for why those particular ideas matter most.
ding ding ding
Senior year of college, I took an elective seminar on interactive fiction. For the final project, one of my classmates wrote a program that scraped a LiveJournal and converted it into a text adventure game.
Scott Computers is married and a father but still writes like an incel and fundamentally can't believe that anyone interested in computer science or physics might think in a different way than he does. Dilbert Scott is an incredibly divorced man. Scott Adderall is the leader of the beige tribe.
They're reeeaallly leaning into the fact that some of the math involved is also used in statistical physics. And, OK, we could have an academic debate about how the boundaries of fields are drawn and the extent to which the divisions between them are cultural conventions. But the more important thing is that the Nobel Prize is a bad institution.
I trained a neural network on all the ways I've said that I hate these people, and it screamed in eldritch spectra before collapsing into silence.
... "Coming of Age" also, oddly, describes another form of novel cognitive dissonance; encountering people who did not think Eliezer was the most intelligent person they had ever met, and then, more shocking yet, personally encountering people who seemed possibly more intelligent than himself.
The latter link is to "Competent Elites", a.k.a., "Yud fails to recognize that cocaine is a helluva drug".
I've met Jurvetson a few times. After the first I texted a friend: “Every other time I’ve met a VC I walked away thinking ‘Wow, I and all my friends are smarter than you.’ This time it was ‘Wow, you are smarter than me and all my friends.’”
Uh-huh.
Quick, to the Bat-Wikipedia:
On November 13, 2017, Jurvetson stepped down from his role at DFJ Venture Capital in addition to taking leave from the boards of SpaceX and Tesla following an internal DFJ investigation into allegations of sexual harassment.
Not smart enough to keep his dick in his pants, apparently.
Then, from 2006 to 2009, in what can be interpreted as an attempt to discover how his younger self made such a terrible mistake, and to avoid doing so again, Eliezer writes the 600,000 words of his Sequences, by blogging “almost daily, on the subjects of epistemology, language, cognitive biases, decision-making, quantum mechanics, metaethics, and artificial intelligence”.
Or, in short, cult shit.
Between his Sequences and his Harry Potter fanfic, come 2015, Eliezer had promulgated his personal framework of rational thought — which was, as he put it, “about forming true beliefs and making decisions that help you win” — with extraordinary success. All the pieces seemed in place to foster a cohort of bright people who would overcome their unconscious biases, adjust their mindsets to consistently distinguish truth from falseness, and become effective thinkers who could build a better world ... and maybe save it from the scourge of runaway AI.
Which is why what happened next, explored in tomorrow’s chapter — the demons, the cults, the hells, the suicides — was, and is, so shocking.
Or not. See above, RE: cult shit.
Organized crime is not a rejection of Americanism, it’s what we fear Americanism to be. It’s our nightmare of the American system. When “Americanism” was a form of cheerful bland official optimism, the gangster used to be destroyed at the end of the movie and our feelings resolved. Now the mood of the whole country has darkened, guiltily; nothing is resolved at the end of “The Godfather,” because the family business goes on.
Wow, it's nice that that doesn't feel at all relevant in this, the year 2024.
As an AI language model, I'd like to point everyone to Max Tegmark's appearances in the old!sneerclub archives.
Artificial Intelligence, noun: a technology which, at great expense, finally allows computers to be bad at math
Vacant, glassy-eyed, plastic-skinned, stamped with a smiley face... "optimistic"
I mean, if the smiley were aligned properly, it would be a poster for a horror story about enforced happiness and mandatory beauty standards. (E.g., "Number 12 Looks Just Like You" from the famously subtle Twilight Zone.) With the smiley as it is, it's just incompetent.