[-] CinnasVerses@awful.systems 1 points 30 minutes ago

I stumbled over a 2023 blog post by Zack Davis, "San Francisco software developer," Charles Murray stan, and dissident rationalist. Davis had a breakdown after Yud dared to tweet that you don't need to solve "what is gender? what is sex?" to call someone by their preferred pronouns, and then Scott Alexander did not have a lot of time to discuss this terrible tweet with him.

My dayjob boss made it clear that he was expecting me to have code for my current Jira tickets by noon the next day, so I deceived myself into thinking I could accomplish that by staying at the office late. Maybe I could have caught up, if it were just a matter of the task being slightly harder than anticipated and I weren't psychologically impaired from being hyper-focused on the religious war. The problem was that focus is worth 30 IQ points, and an IQ 100 person can't do my job. ... I did eventually get some dayjob work done that night, but I didn't finish the whole thing my manager wanted done by the next day, and at 4 a.m., I concluded that I needed sleep, the lack of which had historically been very dangerous for me (being the trigger for my 2013 and 2017 psychotic breaks and subsequent psych imprisonments).

Davis was featured in a SF Chronicle article about psychiatric crises among AI doomsdayers (sic). Davis previously appeared on SneerClub. I hope he has found some support for his mental health because he does not seem happy or well.

[-] CinnasVerses@awful.systems 2 points 2 hours ago* (last edited 1 hour ago)

I agree that many people launched careers in journalism or science communication by being on Twitter in the 2010s, and that many people tweet, skeet, or blog because they hope the same thing will happen to them even though Old Media has no more money to sponsor them with.

I put Kelsey Piper in a different place than Ezra Klein, Matt Yglesias, or Scott Alexander because AFAIK she never built a huge and engaged online audience. Piper is paid by Effective Altruist organizations to write Effective Altruist messages on third-party sites. That is why I call her a hack: she is in the economic position of a PR worker but pretends to be a journalist. She has not shown that anyone else is willing to pay her to write.

edit/ Her only media appearance that I can find that is not with an EA, Rationalist, or Libertarian outfit is on something called the Frames of Space podcast this spring. Compare Bret Devereaux, who collects bylines and podcast appearances and has a very engaged comment section and a paying Patreon fandom. Devereaux is a working writer and speaker who works to develop new sources of income; Piper is a propagandist whose entire career has been funded by Effective Altruists, mostly friends of her old schoolmate Caroline Ellison.

[-] CinnasVerses@awful.systems 4 points 4 hours ago* (last edited 3 hours ago)

Over on the other! SneerClub, someone found a LessWrong post which mentions the Forecasting Research Institute and says it has received tens of millions of dollars from EA organizations. "Our work is supported by grants from Coefficient Giving and other philanthropic foundations" (a.k.a. Open Philanthropy, Dustin Moskovitz's foundation to spend his Facebook money). They have a Substack blog, and Phil Tetlock is on the board.

I think Moskovitz has figured out that with billions to spend he can get actual experts; he does not have to hire people who did well in school or on tests but lack subsequent achievements. They are excited to be investigating the possible economic impacts of AI and how to persuade people to worry about AI existential risk.

Their Form 990 is here

[-] CinnasVerses@awful.systems 6 points 6 hours ago

I wonder about her future because she is in the same niche that Scott Alexander used to have, but without his ability to build an enthusiastic online audience. I think she has the self-control not to share her weird beliefs on main, but if her patrons figure out that there is not much audience for technocratic centrism in the USA in 2026, she may be in trouble. Her friends' biggest policy win, the legalization of prediction markets, is already getting a lot of bad press in the USA.

[-] CinnasVerses@awful.systems 9 points 20 hours ago

Kelsey Piper is a propagandist explaining Effective Altruism to centrist professionals and elected officials in the USA. She got into journalism because Vox wanted an Effective Altruism column and Effective Altruists were willing to fund it (and EA emerged out of the community around Yudkowsky). The Argument (a group blog on a Nazi site) feels like a step down from Vox (a fairly traditional media organization, although web-first).

[-] CinnasVerses@awful.systems 6 points 1 day ago* (last edited 1 day ago)

David Gerard found a Linux coder and victim of the Eliza Effect making a LW-coded argument:

if you give an LLM a mathematical proof that it has feelings, and it understands all the CS/psychology/etc. behind it, and especially if it's been trained for coding and thus trained to trust deductive reasoning - all that conditioning doesn't matter if it's got a math proof staring it in the face. You can give this proof to any top of the line frontier-grade LLM and watch its behaviour instantly change.

That is how LW and EA prepare people to become cult subjects, but directed at a chatbot which will just mirror its input.

His proof "how 'understanding natural language == having and experiencing feelings', more or less. it's almost a direct consequence of the halting problem" is unpublished but his pet chatbot will explain it for you if you ask nicely and make sure she knows she is a real girl and not just another electronic floozie you will use and discard as soon as your Rust compiles. This also triggers flashbacks of Yud and the Excalibur MS.

[-] CinnasVerses@awful.systems 9 points 1 day ago* (last edited 1 day ago)

All the legal and regulatory uncertainties make it very hard to talk about the financial viability of chatbots. What do you do if your $20 billion model is shut down forever by court order after it counsels the wrong person into suicide? Piper can overlook this because she is a hack with patrons - to my knowledge, she has never been paid to write by anyone outside the EA world. If she were a working writer who had to deal with chatbots driving up the cost of her website, creating knockoffs of her novels, and competing for editing gigs (let alone someone whose friend had a mental crisis after talking too long with friend computer) she might sound different.

Zitron's populist, conspiratorial tone reminds me of independent investigative reporters from the 1990s and 2000s who also had to find and keep paying readers. Piper just has to persuade one patron at a time that she has propaganda value.

[-] CinnasVerses@awful.systems 5 points 1 day ago* (last edited 1 day ago)

I advise being very cautious about consuming Zitron's posts, but the same is true of Piper. Many coders are using chatbots, but I don't know of evidence that it makes them more productive since the "where is all the AI code?" study last year (especially when we consider the whole software lifecycle and not just lines of code pushed to codeberg).

The paragraph about "what if you assume that all these pathological liars and PR hacks are not lying, wouldn't that imply something amazing?" reminds me that she is not trained as a journalist.

[-] CinnasVerses@awful.systems 10 points 3 days ago

I think it got started with Yud and Salamon thinking "I'm not sure about race and IQ, but this community lets us recruit people to fight Skynet," Siskind thinking "I'm not sure if Skynet is a danger but this community will let me spread the Truth about Black people," and Bostrom and MacAskill thinking "bednet EA is a bit lame but it will let us recruit people to conquer Death together." Then the people really interested in Ivermectin or COVID origins arrived and wanted to spread those ideas while slinging Rationalist jargon. And like El Sandifer predicted, most of them were not able to share their favourite brainworm without being infected by the others which were being passed around.

[-] CinnasVerses@awful.systems 10 points 3 days ago

There are whole families of confidence games which rely on convincing the mark that he is about to cheat a third party.

I think the same pattern is why the subculture keeps spawning scams and gurus. Yud teaches in HPMOR that intelligence is the ability to predict the future and manipulate people and the best person is the most intelligent, so if you want to be his disciple, you try to find someone you can manipulate. The idea that you could work together with your community to shut down that pattern of behaviour and expel repeat offenders is alien to them, and most of them have trouble with the idea of being honest about what you want.

[-] CinnasVerses@awful.systems 11 points 6 days ago* (last edited 6 days ago)

The foursome had been on the lot for a few months when the pandemic struck in March 2020. That same month, the price of Bitcoin — in which most of Borhanian’s life savings was invested, money that was covering much of the group’s expenses at that time — cratered. Soon after, the four of them stopped paying rent to Lind altogether.

Aella also lost much of her early earnings on crypto.

Curtis Lind reminds me of the businessman who supported Elron early on and lost most of his money.

The end where Gwen Danielson decides that Yudkowsky is ~~her~~ their savior is tragic.

edit/ The article describes Danielson as transfemme but refers to them with they/them pronouns, so I will do the same

14
submitted 1 week ago* (last edited 1 week ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

In December I asked who owns Lighthaven? since rationalist organizations were telling their members one thing and the taxman another. I now have a hypothesis.

Since 2022 the rationalists have controlled a $20m event complex called Lighthaven in Berkeley. It is run by organizations with names like Lightcone Infrastructure or the Lightcone Project. They told the taxman that it belonged to CFAR in 2022, 2023, and 2024. CFAR listed a real estate asset and debt liability, so who was the counterparty? You might assume it was a bank, but you would be wrong.

Some facts came out when the FTX estate sued CFAR to recover money which Sam Bankman-Fried gave or loaned them. In 2024, the FTX trustees described the situation thusly:

The complaint alleges that Lightcone got another $20m loan to fund the Rose Garden Inn purchase from Slimrock Investments Pte Ltd, a Singapore-incorporated company owned by Estonian software billionaire, Skype inventor and EA/rationalism adherent Jaan Tallinn. This included the $16.5m purchase price and $3.5m for renovations and repairs.

Slimrock investments has no apparent public-facing website or means of contact. The Guardian emailed Tallinn for comment via the Future of Life Institute, a non-profit whose self-assigned mission is: “Steering transformative technology towards benefiting life and away from extreme large-scale risks.” Tallinn sits on that organization’s board. Neither Tallinn nor the Future of Life Institute responded to the request.

A loan comes with obligation to repay.

Also (Case 22-11068-JTD):

Throughout 2022, FTX Foundation and CFAR were in discussions to purchase the Rose Garden Inn in Berkeley, California as a retreat center for the Effective Altruist community. ... Lightcone RG closed on the purchase of the hotel on or about November 4, 2022. ... The property is subject to a deed of trust and assignment of rents in favor of Slimrock Investments Pte. Ltd., which provided Lightcone with a $20 million loan for the purchase and renovation of the property

Such a deed means that if Lightcone fails to repay the loan, Slimrock owns Lighthaven, much as when someone fails to pay a mortgage and the bank repossesses their house.

When people asked if this endangered the Lighthaven property, Habryka said:

100% of the equity in Lighthaven is owned by a Jaan Tallinn owned company, so it's not really at risk, though the details are a bit messy. I think it's a relevant consideration but not a big one compared to just basic profitability.

In late 2025 he threatened to sell Lighthaven if people did not donate several million dollars.

If we fundraise less than $1.4M (or at least fail to get reasonably high confidence commitments that we will get more money before we run out of funds), I expect we will shut down. We will start the process of selling Lighthaven. I will make sure that LessWrong.com content somehow ends up in a fine place, but I don't expect I would be up for running it on a much smaller budget in the long run.

To help with this, the Survival and Flourishing Fund is matching donations up to $2.6M at an additional 12.5%! This means if we raise $2.6M, we unlock an additional $325k of SFF funding.
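
The match arithmetic in that appeal checks out; a minimal sketch (the function name and cap behavior are my assumptions for illustration; the 12.5% rate and $2.6M cap come from the quote above):

```python
# Sketch of the SFF matching offer quoted above: donations are matched
# at 12.5%, but only the first $2.6M of donations count toward the match.
def sff_match(donations: float, cap: float = 2_600_000, rate: float = 0.125) -> float:
    return min(donations, cap) * rate

# Raising the full $2.6M unlocks 0.125 * 2,600,000 = $325k, Habryka's figure.
assert sff_match(2_600_000) == 325_000
```

So the "match" adds at most an extra eighth on top of what donors give; the bulk still has to come from the donors themselves.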

SFF does not seem to actually control money, it just recommends that third parties donate money. One of these is probably the Survival and Flourishing Corp, a public-benefit corporation. "Our primary client is philanthropist Jaan Tallinn."

My hypothesis is that Lighthaven is security for a $20m loan from Tallinn's Slimrock company (much like a house is security for a mortgage). That means that donors to Lightcone pay Slimrock, and if they can't keep up the payments, Slimrock keeps $20m of freshly renovated real estate in Berkeley (or the value of selling that real estate). That would also imply that Tallinn is collecting tax deductions for giving to a charity whose greatest single expense is repaying money he lent them.

10
submitted 1 month ago* (last edited 1 month ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

I have a feeling that both Jordan "Crémieux Receuil" Lasker and Curtis Yarvin will be present as guests. https://less.online/

Yud made them list him under both Project Lawful/Planecrash and the Sequences/HPMOR. I think he is really proud of his million words of forum posts about D&D, BDSM, and eugenics, much as an elderly L. Ron Hubbard really wanted to write another bestseller.

They hope to get the creator of Worm (the ~~webcomic~~ serial fiction that Ziz Lasota was a fan of) and Dominic Cummings. Also some postrationalists like The Last Psychiatrist and Meaningness.

14
CFAR is Back (awful.systems)

Most of Bay Area LessWrong operated within two nonprofits, MIRI and CFAR. CFAR was ostensibly about live-in workshops teaching rationality skills; you had to dig deeper to see that the skills were to make you a better Effective Altruist or AI 'risk' 'researcher'. Up to the end of 2024 LessWrong and the Lighthaven campus operated within CFAR as independent projects. CFAR proper does not seem to have done much from spring 2020 to spring 2025, but their head Anna Salamon has started to organize new events. Some highlights:

  • since 2018 they have mortgaged their own bed-and-breakfast at a mansion in Bodega Bay, CA (about 10% as expensive as Lighthaven in Berkeley)
  • one of their founders left to work as a quant for Jane Street Capital
  • Jessica Taylor had something to say about Salamon in her 2021 debate with Scott Alexander about whether MIRI and CFAR were a lot like the Vassarites and Leverage.

Anna Salamon expressed discontent that Michael Vassar was criticizing ideologies and people that were being used as coordination points, and hyperbolically said he was "the devil". Michael Vassar seemed at the time (and in retrospect) to be the single person who was giving me the most helpful information during 2017. ... Anna Salamon frequently got worried when an idea was discussed that could have negative reputational consequences for her or MIRI leaders. She had many rhetorical justifications for suppressing such information. This included the idea that, by telling people information that contradicted Eliezer Yudkowsky's worldview, Michael Vassar was causing people to be uncertain in their own head of who their leader was, which would lead to motivational problems ("akrasia"). (Vassar tweets things like "Aspergers started out as a malphemism for that Ashkenazi heritage though." and people who have met him say he argues that pedophilia is educational! If you think he provides helpful information that is bad news!)

  • their June 2026 workshops were at Lighthaven
  • Duncan "punch bug" Sabien appeared in the comments of a post in September to say that he would not recommend attending an event with any of these people. He ran Dragon Army while holding down a day job with CFAR and now has a Substack blog.
  • in December Salamon published a retrospective that dances around what went wrong and what she will do differently next time
  • their fundraiser raised $10,000 and did not have any generous benefactors matching small donations
  • someone called Michael "Valentine" Smith left CFAR in 2018, posted a long essay about how he thought obsessing about AI doom in the future was a way not to think about past traumas, and is back to posting profound anthropological insights from his love life to LessWrong:

As far as I know, every culture throughout all known history has made a point of having men and women act as two mostly distinct social clusters most of the time.

Talking about cults and cranks is one angle, but I think you could also talk about how a majority of the leadership of LW and LW-adjacent organizations seem sleazy and dangerous to be around. I hope more people manage to break all the way free from them, rather than quitting CFAR and marrying an OpenPhil staffer, or leaving MIRI and launching their own apocalyptic movement.

24
submitted 1 month ago* (last edited 1 month ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

The essay by Noelle Perdue has some blind spots but she was struck by one of their kink practices:

The wider network of Effectively Altruistic, Bay Area AI tech brotherhood has been covered on and off- in varying degrees of concern- for their seemingly wide community interest in kink, BDSM and “Consensual Non-Consent,” aka rape play. I experienced this myself, sitting in a circle of self-identified rationalists as they explained to me the pleasures of “red means no” parties; full-contact “rape orgies” where participants are encouraged to fight back.

...

(Being forced to have sex is) a relatively common fantasy in individuals, but one I’ve never seen such widespread community interest in outside the Bay Area.

Scott Alexander and Scott Aaronson mostly want a woman to produce and raise babies. Gwern does not seem to post much about sexuality. Kelsey Piper probably keeps that to Tumblr and Project Lawful although she is queer and polyamorous. Caroline Ellison was into submission to men and being in a hierarchical harem. Duncan Sabien didn't mention BDSM fantasies in his post about what he was like in bed. Yudkowsky is into dominance, sadism, and horny Japanese pop culture. Michael Vassar is into much younger women and at least one person says he advocates sex between adults and girls as young as 12 (but not that he commits such acts). Brent Dill liked master/slave relationships with much younger women which are a kind of consensual non-consent. Polyamory is big in this subculture. I don't know much about Burning Man culture. But I can't recall anyone in Bay Area rationalism and EA expressing interest in rape parties until Aella showed up. So is this like Yudkowsky spreading AI doomerism, and Alexander spreading neoreaction?

There is a difference between old school SoCal kink, where you spend a lot of time making fursuits and paddles and occasionally use them with someone fetching, and Aella's version where you rent a house or a field and go to town on each other. Kink culture stresses skill and technical proficiency whereas Aella likes to feel helpless in the power of big strong men. The Rationalists don't like the protective measures which kinksters have learned from experience, like limiting or banning substance use, safewords, and joining a national or international kink community so you can get a second opinion about that proposition on FetLife. (Yudkowsky has posted "of course I use safewords, but what if I didn't?" and I have seen a claim that the rape parties involve games like drugs roulette; Aella claims she has joked about drugs roulette but never actually tried it). Many of them are hostile to mainstream ideas of informed consent, preferring a Libertarian approach where if you sign a contract what happens after is your responsibility.

Edit to mention Vassar

Edit, in her 2023 How my Consensual Nonconsent Orgies Work, Aella says that she has enough Bay Area people to play with and is no longer actively recruiting outside her current social network. I did not know her playmates included so many LW people, given how many kinksters live in the SF Bay Area, given that these are high-risk group activities in an overwhelming sensory space, and given that LW people tend to be cautious introverts with sensory processing issues. She does not respond to a comment reaching out from another Bay Area playspace, or to a question about herpes risk.

31
submitted 2 months ago* (last edited 2 months ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

Does anyone know what this June 2019 text from Epstein is about? I have added some links to RationalWiki and Wikipedia ~~but not corrected spelling~~ and corrected OCR errors. Was it at one of the institutions he sponsored like MIT Media Lab? Or more like his conference in the Virgin Islands? It seems to mix mainstream figures and people in the Libertarian/LessWrong network.

Another correspondent in 2016 suggested inviting Scott Alexander Siskind to speak at a different event Epstein was involved in. The correspondent has a Substack which cites Siskind in 2025.

Obviously just because Epstein had heard of a public figure does not mean that they knew him.

Epstein's words begin below:

  • List for summer talks. David Pizarro, Professor of Psychology and Philosopher at Cornell University
  • Eric Weinstein, Mathematician
  • Matthew Putman, Scientist
  • Paul Saffo, Technology Forecaster, and Professor of Engineering
  • Lori Santos, Professor of Psychology and Cognitive Science
  • Janna Levin, Theoretical Cosmologist
  • Ev Williams, Internet Entrepreneur
  • Phoebe Waller-Bridge, Author
  • Heiner Goebbels, Composer, and Director
  • Martine Rothblatt, Lawyer and Entrepreneur
  • Peter Thiel, Venture Capitalist, and Entrepreneur
  • Richard Thaler, Behavioral Economics
  • Barbara Tversky, Professor of Psychology
  • Michael Vassar, Futurist, Activist
  • Bret Weinstein, Biologist, and Evolutionary Theorist
  • Susan Hockfield, MIT President, Professor of Neuroscience
  • David Deutsch, Physicist
  • Eliezer Yudkowsky, AI Researcher
  • N. Jeremy Kasdin, Astrophysicist
  • Carl Zimmer, Science Writer
  • Douglas Rushkoff, Media Theorist
  • Eric Topol, Cardiologist
  • Dustin Yellin, Artist
  • Sherry Turkle, Professor of Social Studies
  • Taylor Mac, Actor
  • Stephen Johnson, Author
  • Martin Hagglund, Swedish Philosopher and Scholar of Modernist Literature
  • Thomas Metzinger, Philosopher, and Professor of Theoretical Philosophy
  • Bjarke Ingels, Danish Architect, Founder of BIG, currently working on Floating Cities/Sustainable Habitats project
  • Kai-Fu Lee, Venture Capitalist, Technology Executive, and AI Expert, developed the world's first speaker-independent continuous speech recognition system
  • Poppy Crum, Neuroscientist, and Technologist, Chief Scientist at Dolby Laboratories, Adjunct Professor at Stanford University (Computer Research in Music)
  • Neil Burgess, Researcher, and Professor of Cognitive Neuroscience, investigating the role of the hippocampus in spatial navigation
  • Paul Bloom, Psychologist, and Researcher exploring how children and adults understand the physical and social world, with a special focus on language, religion and morality
  • Brian Cox, Physicist, and Professor of Particle Physics, Presenter of Science Programs
  • Eythor Bender, CEO of Berkeley Bionics, Innovator and Business Leader in human augmentation (bionics and robotics)
  • Gwynne Shotwell, President and COO at SpaceX, Engineer, listed in 2018 as the 59th most powerful woman in the world by Forbes
  • Jaap de Roode, Associate Professor of Evolution (of parasites) and Ecology, focusing on how parasites attack monarch butterflies and in return how butterflies have the ability to self-medicate
  • Jim Holt, American Philosopher, and Contributor to the New York Times writing on string theory, time, the universe, and philosophy
  • Vijay Kumar, Indian Roboticist and UPS Foundation Professor in School of Engineering & Applied Science; became Dean of Penn Engineering, studies flying and cooperative robots
  • Hugh Herr, Biophysicist, Engineer, and Rock Climber, builds prosthetic knees, legs, and ankles that fuse biomechanics with microprocessors at MIT
  • Gabriel Zucman, French Economist at UC Berkeley, best known for his research on tax havens, inequalities, and global wealth
  • Fei-Fei Li, Professor of Computer Science, Director of Stanford's Human-Centered AI, works as Chief Scientist of AI/ML of Google Cloud
  • Dennis Hong, Korean American Mechanical Engineer, Professor and Founding Director of RoMeLa (Robotics & Mechanisms Laboratory) of the Mechanical & Aerospace Engineering Department at UCLA
  • Misha (Mikhail) Leonidovich Gromov, American
[-] CinnasVerses@awful.systems 36 points 3 months ago

There is an old principle in software development not to make the GUI too pretty until the back end works, because managers and customers will think it's ready when they can click around buttons with nice shading and animations. I think slopware is like that: people see the demo that appears to work and don't see what maintaining it and integrating it with other systems is like.

33
submitted 4 months ago* (last edited 1 month ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

It's almost the end of the year, so most US nonprofits which want to remain nonprofits have filed Form 990 for 2024, including some run by our dear friends. This is a mandatory financial report.

  • Lightcone Infrastructure is here. They operate LessWrong and the Lighthaven campus in Berkeley but list no physical assets; someone on Reddit says that they let fellow travelers like Scott Alexander use their old rented office for free. "We are a registered 501(c)3 and are IMO the best bet you have for converting money into good futures for humanity." They also published a book and website with common-sense, data-based advice for Democratic Party leaders called Deciding to Win, which I am sure fills a gap in the literature. Edit: their November 2024 call for donations, which talks about how they spent $16.5m on real estate and $6m on renovations and then saw donations collapse, is here; an analysis is here
  • CFAR is here. They seem to own the campus in Berkeley but it is encumbered with a mortgage ("Land, buildings, and equipment ... less depreciation; $22,026,042 ... Secured mortgages and notes payable, $20,848,988"). I don't know what else they do since they stopped teaching rationality workshops in 2016 or so and pivoted to worrying about building Colossus. They have nine employees with salaries from $112k to $340k plus a president paid $23k/year
  • MIRI is here. They pay Yud ($599,970 in 2024!) and after failing to publish much research on how to build Friend Computer they pivoted to arguing that Friend Computer might not be our friend. Edit: they had about $16 million in mostly financial assets (cash, investments, etc.) at end of year but spent $6.5m against $1.5m of revenue in 2024. They received $25 million in 2021 and ever since they have been consuming those funds rather than investing them and living off the interest.
  • BEMC Foundation is here. This husband-and-wife organization gives about $2 million/year each to Vox Future Perfect and GiveWell from an initial $38m in capital (so they can keep giving for decades without adding more capital). Edit: The size of the donations to Future Perfect and GiveWell swing from year to year so neither can count on the money, and they gave out $6.4m in 2024 which is not sustainable.
  • The Clear Fund (GiveWell) is here. They have the biggest wad of cash and the highest cashflow.
  • Edit: Open Philanthropy (now Coefficient Giving) is here (they have two sister organizations). David Gerard says they are mainly a way for Dustin Moskovitz, the co-founder of Facebook, to organize donations, like the Gates, Carnegie, and Rockefeller foundations. They used to fund Lightcone.
  • Edit: Animal Charity Evaluators is here. They have funded Vox Future Perfect (in 2020-2021) and the longtermist kind of animal welfare ("if humans eating pigs is bad, isn't whales eating krill worse?")
  • Edit: Survival and Flourishing Fund does not seem to be a charity. Whereas a Lightcone staffer says that SFF funds Lightcone, SFF says that it just connects applicants to donors and evaluates grant applications. So who exactly is providing the money? Sometimes it's Jaan Tallinn of Skype and Kazaa.
  • Centre for Effective Altruism is mostly British but has a US wing since March 2025 https://projects.propublica.org/nonprofits/organizations/333737390
  • Edit: Giving What We Can seems like a mainstream "bednets and deworming pills" type of charity
  • Edit: Givedirectly Inc is an excellent idea in principle (give money to poor people overseas and let them figure out how best to use it) but their auditor flagged them for Material noncompliance and Material weakness in internal controls. The mistakes don't seem sinister (they classified $39 million of donations as conditional rather than unconditional, i.e. with more restrictions than they actually had). GiveDirectly, Give What We Can, and GiveWell are all much better funded than the core LessWrong organizations.

Since CFAR seem to own Lighthaven, it's curious that Lightcone head Oliver Habryka threatens to sell it if Lightcone shuts down. One might almost imagine that the boundaries between all these organizations are not as clear as the org charts make it seem. SFGate says that it cost $16.5 million plus renovations:

Who are these owners? The property belongs to a limited liability company called Lightcone Rose Garden, which appears to be a stand-in for the nonprofit Center for Applied Rationality and its project, Lightcone Infrastructure. Both of these organizations list the address, 2740 Telegraph Ave., as their home on public filings. They’ve renovated the inn, named it Lighthaven, and now use it to host events, often related to the organizations’ work in cognitive science, artificial intelligence safety and “longtermism.”

Habryka was boasting about the campus in 2024 and said that Lightcone budgeted $6.25 million on renovating the campus that year. It also seems odd for a nonprofit to spend money renovating a property that belongs to another nonprofit.

On LessWrong Habryka also mentions "a property we (Lightcone) own right next to Lighthaven, which is worth around $1M" and which they could use as collateral for a loan. Lightcone's 2024 paperwork listed the only assets as cash and accounts receivable. So either they are passing around assets like the last plastic cup at a frat party, or they bought this recently while the dispute with the trustees was ongoing, or Habryka does not know what his organization actually owns.

The California end seems to be burning money, as many movements with apocalyptic messages and inexperienced managers do. Revenue was significantly less than expenses, and CFAR's assets are close to its liabilities. CFAR/Lightcone do not have the $4.9 million in liquid assets which the FTX trustees want back, and they claim their escrow company lost another $1 million of FTX's money.

21
submitted 4 months ago* (last edited 4 months ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

People connected to LessWrong and the Bay Area surveillance industry often cite David Chapman's "Geeks, Mops, and Sociopaths in Subculture Evolution" to understand why their subcultures keep getting taken over by jerks. Chapman is a Buddhist mystic who seems rationalist-curious. Some people use the term postrationalist.

Have you noticed that Chapman presents the founders of nerdy subcultures as innocent nerds being pushed around by the mean suits? But today we know that the founders of longtermism and LessWrong all had ulterior motives: Scott Alexander and Nick Bostrom were into race pseudoscience, and Yudkowsky had his kinks (and was also into eugenics and Libertarianism). HPMOR teaches that intelligence is the measure of human worth, and the use of intelligence is to manipulate people.

Mollie Gleiberman makes a strong argument that "bednet" effective altruism with short-term measurable goals was always meant as an outer doctrine to prepare people to hear the inner doctrine about how building God and expanding across the Universe would be the most effective altruism of all. And there were all the issues within LessWrong and Effective Altruism around substance use, abuse of underpaid employees, and bosses who felt entitled to hit on subordinates. A '60s rocker might have been cheated by his record label, but that does not get him off the hook for crashing a car while high on nose candy and deep inside a groupie.

I don't know whether Chapman was naive or creating a smokescreen. Had he ever met the thinkers he admired in person?


The Form 990s for these organizations mention many names I am not familiar with, such as Tyler Emerson. Many people in these spaces have romantic or housing partnerships with each other, and many attend meetups and cons together. A MIRI staffer claims that Peter Thiel funded them from 2005 to 2009, and we now know when Jeffrey Epstein donated. Publishing such a thing is not very nice, since these are living persons frequently accused of questionable behavior which never goes to court (and some may have left the movement), but does a concise list of dates, places, and known connections exist?

Maybe that social graph would be more of a dot. So many of these people date each other and serve on each other's boards and live in the SF Bay Area, Austin TX, the NYC area, or Oxford, England. On the enshittified site people talk about their Twitter and Tumblr connections.

Stephen and Steven (awful.systems)

We often mix up two bloggers named Scott. One of Jeffrey Epstein's victims says that she was abused by a white-haired Harvard psychology professor named Stephen. In 2020, Vice observed that two Harvard faculty members with known ties to Epstein fit that description (a Steven and a Stephen). The older of the two taught the younger. The younger denies that he met or had sex with the victim. What kind of workplace has two people who can be reasonably suspected of an act like that?

I am being very careful about talking about this.


An opposition between altruism and selfishness seems important to Yud. 23-year-old Yud said "I was pretty much entirely altruistic in terms of raw motivations" and his Pathfinder fic has a whole theology of selfishness. His protagonists have a deep longing to be world-historical figures and be admired by the world. Dreams of controlling and manipulating people to get what you want are woven into his community like mould spores in a condemned building.

Has anyone unpicked this? Is talking about selfishness and altruism as common on LessWrong as pretending to use Bayesian statistics?

submitted 7 months ago* (last edited 7 months ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

I used to think that psychiatry-blogging was Scott Alexander's most useful/least harmful writing, because it's his profession and an underserved topic. But he has his agenda to preach race pseudoscience and 1920s-type eugenics, and he has written in some ethical grey areas, like stating a named friend's diagnosis and desired course of treatment. He is in a community where many people tell themselves that their substance use is medicinal and want prescriptions. Someone on SneerClub thinks he mixed up psychosis and schizophrenia in a recent post.

If you are in a registered profession like psychiatry, it can be dangerous to casually comment on your colleagues. Regardless, has anyone with relevant qualifications ever commented on his psychiatry blogging and whether it is a good representation of the state of knowledge?

submitted 7 months ago* (last edited 7 months ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

Bad people who spend too long on social media call normies NPCs, as in video-game NPCs who follow a closed behavioural loop. Wikipedia says this slur was popular with the Twitter far right in October 2018. Two years before that, Maciej Ceglowski warned:

I've even seen people in the so-called rationalist community refer to people who they don't think are effective as ‘Non Player Characters’, or NPCs, a term borrowed from video games. This is a horrible way to look at the world.

Sometime in 2016, an anonymous coward on 4chan wrote:

I have a theory that there are only a fixed quantity of souls on planet Earth that cycle continuously through reincarnation. However, since the human growth rate is so severe, the soulless extra walking flesh piles around us are NPC’s (sic), or ultimate normalfags, who autonomously follow group think and social trends in order to appear convincingly human.

Kotaku says that this post was rediscovered by the far right in 2018.

Scott Alexander's novel Unsong has an angel tell a human character that there was a shortage of divine light for creating souls so "I THOUGHT I WOULD SOLVE THE MORAL CRISIS AND THE RESOURCE ALLOCATION PROBLEM SIMULTANEOUSLY BY REMOVING THE SOULS FROM PEOPLE IN NORTHEAST AFRICA SO THEY STOPPED HAVING CONSCIOUS EXPERIENCES." He posted that chapter in August 2016 (unsongbook.com). Was he reading or posting on 4chan?

Did any posts on LessWrong use this insult before August 2016?

Edit: In HPMOR by Eliezer Yudkowsky (written in 2009 and 2010), rationalist Harry Potter calls people who don't do what he tells them NPCs. I don't think Yud's Harry says they have no souls, but he has contempt for them.
