[-] lurker@awful.systems 3 points 1 day ago

Damn, she went in on Yud (not that he doesn’t deserve it)

I also found this article about the documentary in the replies of that post, https://buttondown.com/maiht3k/archive/a-tale-of-two-ai-documentaries/ , and it’s a very interesting read

[-] lurker@awful.systems 2 points 2 days ago

Update: the screenshot is unfortunately not LLM generated, found the full version on Reddit SneerClub https://web4.ai/

[-] lurker@awful.systems 3 points 2 days ago* (last edited 2 days ago)

New “AI is not a bubble” video just dropped: https://youtu.be/wDBy2bUICQY. A lot of skeptical comments are pointing out the flaws in its argument, while the creator tries to defend themselves with mostly mediocre lines

[-] lurker@awful.systems 3 points 2 days ago* (last edited 2 days ago)

I took a deeper look into the documentary, and it does go into both the pessimist and optimist perspectives, so their inclusion makes more sense. And yeah, I was trying to get at how they’re skeptical of the TESCREAL stuff and of current LLM capabilities

[-] lurker@awful.systems 3 points 2 days ago* (last edited 2 days ago)

I poked around the IMDb page, and there are reviews! Currently it’s sitting at an 8.5/10 with 31 ratings (though no written reviews, it seems). The Metacritic score is a 51/100 with 4 reviews, and there are 4 external reviews

[-] lurker@awful.systems 15 points 2 days ago

HOLY SHIT LMFAOOOOO

[-] lurker@awful.systems 3 points 2 days ago* (last edited 2 days ago)

Sam Altman and the other CEOs being there is such a joke: “this technology is so dangerous guys! of course I’m gonna keep blocking regulation for it, I need to make money after all!” Also, I’m shocked Emily Bender and Timnit Gebru are there, aren’t they AI skeptics?

[-] lurker@awful.systems 3 points 2 days ago

huh, you’re right. usually this channel provides a source for the things they share, but this time there’s nothing.

[-] lurker@awful.systems 8 points 3 days ago

the full paper is here: https://x.com/alexwg/status/2022292731649777723 and immediately there are two references to Nick Bostrom and Scott Alexander

[-] lurker@awful.systems 6 points 3 days ago

I’m pretty sure most of this has already been posted to this thread (I know the “AI published a hit piece on me” thing was), but here’s more Moltbook/Openclaw/whatever-it’s-called nonsense

11
submitted 1 week ago* (last edited 1 week ago) by lurker@awful.systems to c/sneerclub@awful.systems

this was already posted on reddit sneerclub, but I decided to crosspost it here so you guys wouldn’t miss out on Yudkowsky calling himself a genre-savvy character, and on him taking what appears to be a shot at the Zizians

29
submitted 2 weeks ago* (last edited 2 weeks ago) by lurker@awful.systems to c/sneerclub@awful.systems

originally posted in the thread for sneers not worth a whole post, then I changed my mind and decided it is worth a whole post, cause it is pretty damn important

Posted on r/HPMOR roughly one day ago

full transcript:

Epstein asked to call during a fundraiser. My notes say that I tried to explain AI alignment principles and difficulty to him (presumably in the same way I always would) and that he did not seem to be getting it very much. Others at MIRI say (I do not remember myself / have not myself checked the records) that Epstein then offered MIRI $300K; which made it worth MIRI's while to figure out whether Epstein was an actual bad guy versus random witchhunted guy, and ask if there was a reasonable path to accepting his donations causing harm; and the upshot was that MIRI decided not to take donations from him. I think/recall that it did not seem worthwhile to do a whole diligence thing about this Epstein guy before we knew whether he was offering significant funding in the first place, and then he did, and then MIRI people looked further, and then (I am told) MIRI turned him down.

Epstein threw money at quite a lot of scientists and I expect a majority of them did not have a clue. It's not standard practice among nonprofits to run diligence on donors, and in fact I don't think it should be. Diligence is costly in executive attention, it is relatively rare that a major donor is using your acceptance of donations to get social cover for an island-based extortion operation, and this kind of scrutiny is more efficiently centralized by having professional law enforcement do it than by distributing it across thousands of nonprofits.

In 2009, MIRI (then SIAI) was a fiscal sponsor for an open-source project (that is, we extended our nonprofit status to the project, so they could accept donations on a tax-exempt basis, having determined ourselves that their purpose was a charitable one related to our mission) and they got $50K from Epstein. Nobody at SIAI noticed the name, and since it wasn't a donation aimed at SIAI itself, we did not run major-donor relations about it.

This reply has not been approved by MIRI / carefully fact-checked, it is just off the top of my own head.

33

I searched for “eugenics” on Yud’s xcancel (i will never use twitter, fuck you elongated muskrat) because I was bored, and got flashbanged by this gem. Yud, genuinely, what are you talking about?

