That’s an AI governance PhD right there!
Bill Gates is having a normal one.
https://www.cnbc.com/2025/03/26/bill-gates-on-ai-humans-wont-be-needed-for-most-things.html
Wanting to escape the fact that we are beings of the flesh seems to be behind so much of the rationalist-reactionary impulse – a desire to one-up our mortal shells through eugenics, weird diets, ‘brain uploading’ and something like vampirism with the Bryan Johnson guy. It’s wonderful you found a way to embrace and express yourself instead! Yes, in a healthier relationship with our bodies – which is what we are – such changes would be considered part of general healthcare. From here in Europe, at least, it sometimes appears particularly extreme in the US – maybe a heritage of puritanical norms.
Reminds me of the stories of how Soviet peasants during Stalin’s rapid industrialization drive, who’d never before seen any machinery in their lives, would get emotional with faulty machines and try to coax them like they were their farm animals. But these were Soviet peasants! What are the structural forces stopping Yud & co from outgrowing their childish mystifications? Deeply misplaced religious needs?
A generous interpretation may be that writing music in the context of the modern music industry really is creatively unsatisfying for composers, but the solutions to that have nothing to do with magical tech-fixes and everything to do with politics, which is of course anathema to these types. What dumb times we live in.
Testing for genetic defects is very different from the Gattaca-premise of most everything about a person being genetically deterministic, with society ordered around that notion. My point was that such a setting is likely inherently impossible, since “heritability” doesn’t work like that; the most techbros can do is LARP at it, which, granted, can be very dangerous on its own – the fact that race is a social construct doesn’t preclude racism and so on. But there’s no need to get frightened by science fiction when science facts tell a different story.
Amazing quote he included from Tyler Cowen:
If you are ever tempted to cancel somebody, ask yourself “do I cancel those who favor tougher price controls on pharma? After all, they may be inducing millions of premature deaths.” If you don’t cancel those people — and you shouldn’t — that should broaden your circle of tolerance more generally.
Yes, leftists, your not cancelling someone campaigning for lower drug prices is actually the same as endorsing mass murder, and hence you should think twice before cancelling sex predators. It’s in fact called ephebophilia.
What the globe emoji guy followed with is also a classic example of rationalists getting mesmerized by their own verbiage:
What I like about this framing is how it aims to recalibrate our sense of repugnance in light of “scope insensitivity,” a deeply rooted cognitive bias that occurs “when the valuation of a problem is not valued with a multiplicative relationship to its size.”
Where did you get that impression from? He says himself he is not advocating against aid per se, but that its effects should be judged more holistically, e.g. that organizations like GiveWell should also include the potential harms alongside benefits in their reports. The overarching message seems to be one of intellectual humility – to not lose sight that the ultimate aim is to help another human being who in the end is a person with agency just like you, not to feel good about yourself or to alleviate your own feelings of guilt.
The basic conceit of projects like EA is the incredible high of self-importance and moral superiority one can get blinded by when one views oneself as more important than other people by virtue of helping so many of them. No one likes to be condescended to; sure, a life saved with whatever technical fix is better than a life lost, but human life is about so much more than bare material existence – dignity and freedom are crucial to a good life. The ultimate aim should be to shift agency and power into the hands of the powerless, not to bask in being the white knight trotting around the globe, saving the benighted from themselves.
How much money would be saved by just funneling the students of these endless ‘AI x’ programs back to the humanities, where they can learn to write (actually good) science fiction to their hearts’ content? Hey, finally a way AI actually led to some savings!