this post was submitted on 17 Aug 2025
24 points (100.0% liked)
TechTakes
In case you needed more evidence that the Atlantic is a shitty rag.
The phrase "adorned with academic ornamentation" sounds like damning with faint praise, but apparently they just mean it as actual praise, because the rot has reached their brains.
The implication that Soares / MIRI were doing serious research before is frankly journalist malpractice. Matteo Wong can go pound sand.
It immediately made me wonder about his background. He's quite young and looks to be just out of college. If I had to guess, I'd say he was probably a member of the EA club at Harvard.
His group chats with Kevin Roose must be epic.
He was writing less dumb shit a coupla years ago.
Just earlier this month, he was brushing off all the problems with GPT-5 and saying that "OpenAI is learning from its greatest success." He wrapped up a whole story with the following:
Weaselly little promptfucker.
also, they misspelled "Eliezer", lol
I've created a new godlike AI model. It's the Eliziest yet.
My copy of "the singularity is near" also does that btw.
(E: Still looking to confirm that this isn't just my copy, or if it is common, but when I'm in a library I never think to look for the book, and I don't think I have ever seen the book anywhere anyway. It is the 'our sole responsibility...' quote, no idea which page, but it was early on in the book. 'Yudnowsky').
Image and transcript
Transcript: Our sole responsibility is to produce something smarter than we are; any problems beyond that are not ours to solve....[T]here are no hard problems, only problems that are hard to a certain level of intelligence. Move the smallest bit upwards [in level of intelligence], and some problems will suddenly move from "impossible" to "obvious." Move a substantial degree upwards and all of them will become obvious.
—ELIEZER S. YUDNOWSKY, STARING INTO THE SINGULARITY, 1996
Transcript end.
How little has changed; he has always believed intelligence is magic. Also lol at the 'smallest bit'. Not totally fair to sneer at this since he wrote it when he was 17, but oof, being quoted in a book like this will not have been good for Yudkowsky's ego.
The Atlantic puts the "shit" in "shitlib"