They have deleted Tara from their blog, and Trace is listed as "on temporary leave" since October 2025. I don't think anyone who wants policy reform rather than institutional destruction will have work in the USA until 2029.
That part of Trace's response was odd, because one of Brennan's themes was "we should have fewer cults of personality and more peers working together." That seems naive, but at least Brennan agrees that cults of personality are bad and that Nonlinear Fund needed to be fired into the sun.
In 2024 Ozy Brennan was indignant about Nonlinear Fund, the "incubator of AI-safety meta-charities" which lived as global nomads, hired a live-in personal assistant, asked her to smuggle drugs across borders for them, let a kind-of-colleague take her to bed, then did not pay her regularly and in full.
The correct number of times for the word “yachting” to occur in a description of an effective altruist job is zero. I might make an exception if it’s prefaced with “convincing people to donate to effective charities instead of spending money on.”
Trace popped up in the comments:
Inasmuch as EA follows your preferences, I suspect it will either fail as a subculture or deserve to fail. You present a vision of a subculture with little room for grace or goodwill, a space where everyone is constantly evaluating each other and trying to decide: are you worthy to stand in our presence? Do you belong in our hallowed, select group? Which skeletons are in your closet? Where are your character flaws? What should we know, what should we see, that allows us to exclude you?
Ozy stands with us on this one, buddy.
It feels like a teenage argument about Batman v. Superman or the USS Enterprise v. a Star Destroyer. I think many LessWrongers are not serious about the belief system as something to act on, but the problem is that when they are serious you get Ziz Lasota. It's also similar to how they love markets in theory but don't want to start a business or make speculative investments.
While I tend to think Yudkowsky is sincere, some things, like his prediction market for P(doom), are hard to square with that: https://manifold.markets/EliezerYudkowsky/will-ai-wipe-out-humanity-by-2030-r (launched June 2023; it will resolve N/A on 1 January 2027 if the world has not ended yet, and it has not moved much since 1 January 2024).
An early hint of Gwern's rejection of chaos theory appears in the Sequences from 2008 (the "build God to conquer Death" essay):
And the adults wouldn't be in so much danger. A superintelligence—a mind that could think a trillion thoughts without a misstep—would not be intimidated by a challenge where death is the price of a single failure. The raw universe wouldn't seem so harsh, would be only another problem to be solved.
Someone who got through high-school math or built a working system would probably have encountered the combinatorial explosion, the impossibility of representing 0.1 exactly in binary floating point, chaos theory, and so on. Even game theory has situations where optimal play guarantees a tie but not a win. But Yud was much too special for any of those and refused offers to learn.
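The floating-point point is easy to see for yourself. A minimal Python sketch (the `Decimal` trick just reveals the exact binary value the machine actually stores):

```python
from decimal import Decimal

# 0.1 has no exact binary representation; the stored double is
# the nearest representable value, slightly above one tenth.
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625

# The rounding errors are big enough to break naive equality checks:
print(0.1 + 0.2 == 0.3)  # False
```

This is exactly the kind of thing anyone who has shipped numerical code has been bitten by at least once.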
The subtext sounds like "we guarantee your returns, then go public. If we go bankrupt you get the retail investors' money, if we become the next Google you get your own private island." All you have to do is trust Sam Altman and (breaks out in hysterical laughter).
Do they mean 17.5% a year? My balanced bond-equity portfolio made 14-15% annual returns over the past three years by the radical method of buying "shares of companies that make profits" and "bonds backed by my local and national government." (Update: I made about 12% a year because I backed out of American stocks years ago, but the blandest 60% stock, 40% bond index fund in my country returned that 14-15% a year after expenses).
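For scale, a back-of-envelope compounding check (the rates here are just the figures from the paragraph above, not anyone's actual portfolio):

```python
def cumulative_growth(annual_rate: float, years: int) -> float:
    """Total growth factor from compounding a constant annual return."""
    return (1 + annual_rate) ** years

# 12%, 14.5%, and 17.5% a year over the same three-year window:
for rate in (0.12, 0.145, 0.175):
    print(f"{rate:.1%}/yr over 3 yr -> {cumulative_growth(rate, 3):.3f}x")
```

Over three years the gap between a boring index fund and the promised rate is real but not miraculous, which is rather the point.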
Lots of people try "you play the student renting in a bad part of town, and I play the sexy burglar" and a few sign up for a porn shoot where they can be ravaged by skilled, athletic people in a controlled environment. A house party where you sign up to be ravaged by any guest who can catch you is unusual even in fetishland and I don't think many kinksters would recommend combining that with drugs.
An Aella-curious blogger in SoCal has noticed something:
But what I find more interesting than broadly “weird sex” is the specific interest in BDSM, kink and particularly full-contact CNC; a relatively common fantasy in individuals, but one I’ve never seen such widespread community interest in outside the Bay Area.
Kink and power-play are practices of manufactured risk, with CNC clocking in at a more intense point on the same spectrum. The idea that many of these people are devoting their 9-5s and beyond to eliminating the ultimate consequence (death), only to go home and collectively play-pretend violence (scaffolded with extensive rules and consent forms) is fascinating, and, to me, makes complete sense.
The rationalist interest in manufacturing risk is the direct byproduct of their commitment to flushing it out.
The blogger attended Aella's SlutCon. I don't know if she knows that many of our friends have problems with consent as most of us understand it (their understanding is more "if they are old enough to sign the contract, and they sign, that is on them").
This was in October 2016, eight years after Epstein was convicted of soliciting sexual services from girls as young as 14. MIRI spent 2014 and 2015 fighting, and eventually settling with, a former staffer who accused board members of statutory rape. Their legal expenses in those years were around $250k, similar to the money Yud says Epstein offered. So at the time, Yudkowsky was very familiar with the concept of older men seeking sex from underage girls, and with the risks of associating MIRI with it. I don't remember the exact timeline of Brent Dill's Bay Area phase, but that would have left Yud familiar with another case where an older man abused younger women and girls.
The original email thread includes this exchange:
Yudkowsky: "... (Sorry for the delay in answering; I was checking with Nate (Executive Director) to see what we knew about why the fundraiser is going slowly.)"
Epstein: "Were you clearing my name with him"
Yudkowsky: "Not sure what you mean. Nate (Soares) knows you're Jeffrey E. I check not-yet-published info/speculation past him before saying it."
The phrase "worth MIRI’s while to figure out whether Epstein was an actual bad guy versus random witchhunted guy" sounds like Yud has been listening to Scott Alexander and Scott Aaronson about how rich or educated white men are the real victims and hos be liars. It sounds like he was familiar with the substance of the accusations and thought there was a good chance they were untrue and not the tip of the iceberg.
There is an old principle in software development: don't make the GUI too pretty until the back end works, because managers and customers will think it's ready once they can click around buttons with nice shading and animations. I think slopware is like that. People see the demo that appears to work and don't see what maintaining it and integrating it with other systems is like.
2007: Robin Hanson blogs about paternalism
August 2025: Someone on a mailing list suggests that the Debian instance with the off-colour jokes from 1980s hacker culture should be sold in:
November 2025: Yudkowsky tweets about an Ill-Advised Consumer Goods Store selling goods such as LSD. The rest of the tweet is about what MiriCult accused him of.
I guess Yud liked that random post?