[-] Soyweiser@awful.systems 1 points 6 hours ago* (last edited 6 hours ago)

That is still quite high, right? Esp considering they think 5% of Null-A is quite high. (For some reason I once had two copies of that.) (I have read Null-A but not Metamorph of Prime.)

[-] Soyweiser@awful.systems 6 points 10 hours ago

The manosphere lingo, the header image with the leather jacket and the fake signing of a boob, the self-dealing, the pretending they pay all their devs/researchers a lot of money, the 'who uses the most tokens' leaderboard. There is such a sick desperation in all this.

We were too hard on the previous wave, who, like Ballmer, were just cringe capitalist overlords.

[-] Soyweiser@awful.systems 6 points 1 day ago

I think that if someone were to be as obsessed with living forever as LWers are, it would be seen as a form of mental illness and the Minds would gently try to correct it.

Yeah, I don't think they would care if it was just a few, or a small group, but Culture people who start to claim others are deathists, the extremes of whom have all kinds of weird violent thoughts about them, would be concerning. Doubt it would be a huge concern to the Minds however; they prob only really get active when one of them also starts wanting to create an empire or something, but it is hard to amass resources for that in the Culture, esp if no Mind is on your side.

Do wonder why we never see Culture people who worship the Minds as gods.

[-] Soyweiser@awful.systems 6 points 1 day ago

This gives me very high live-service video game monetization feelings, another reason to stay far away from it. At least they don't have the thing where everything costs multiples of 50 and you buy token amounts not divisible by 50.

[-] Soyweiser@awful.systems 4 points 1 day ago

Something is not reich about that.

[-] Soyweiser@awful.systems 9 points 2 days ago

Interesting that in the comments somebody also mentions that the people of the Culture euthanize themselves after a couple of centuries. No big shock that the LW people would disagree with that, as part of the LW idea space is living forever in a computer simulation. So the Culture can't be utopian or good just because of that.

[-] Soyweiser@awful.systems 3 points 3 days ago

Wtf is up with the face in the top left circle?

[-] Soyweiser@awful.systems 3 points 3 days ago* (last edited 3 days ago)

I think that, while many LessWrong readers do believe that one party is way better than the other, such that the inter-party quality variation is far larger than the intra-party quality variation, this is not true of all readers.

... Wait is this about race and iq again?

Anyway the math ain't mathing, as there can never be a Republican above average enough to counterbalance the fact that they are a Republican.

[-] Soyweiser@awful.systems 9 points 5 days ago

This person also seems to have no concept of the finality of death

The god AI can perfectly simulate people, and as a copy is you, death isn't permanent. And when you start to think this is inevitable and close, murder becomes just another way to signal how strongly you feel about a thing.

[-] Soyweiser@awful.systems 10 points 5 days ago

soul forges continually working to bring back the dead

Even in death, duty does not end.

[-] Soyweiser@awful.systems 6 points 5 days ago

'top ai'. so it is a sex thing after all.

Just looking at that picture makes me hear googols of unsimulated people scream in terror.

132
submitted 2 weeks ago* (last edited 2 weeks ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

As we don't have a top level post about this already (nor on reddit) I thought why not make one. Archive.is

Extremely likely the guy was a lesswronger, or at least radicalized by that sort of thinking.

But not much else seems to be known as far as I can tell. Corbin also posted about the HN reactions in the stubsack.

And remember, no fed posting.

Edit: looks like his house also got shot at. Archive (after the speculation in this thread, makes you wonder if this was a follow-up false flag, as the bottle didn't break last time).

15

Via reddit's sneerclub. Thanks u/aiworldism.

I have called LW a cult incubator for a while now, and while the term has not caught on, it's nice to see more reporting on the problem that LW makes you more likely to join a cult.

https://www.aipanic.news/p/the-rationality-trap is the original link, for the people who don't like archive.is. I used the archive because I don't like Substack and want to discourage its use.

22
submitted 8 months ago* (last edited 8 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

As found by @gerikson here, more from the anti-anti-TESCREAL crowd. How the antis are actually REPRESENTATIONALism. Ottokar expanded on their idea in a blog post.

Original link.

I have not read the bigger blog post yet btw, just assumed it would be sneerable and posted it here for everyone's amusement. Learn about your own true motives today. (This could be a troll of course; boy, does he drop a lot of names and think that is enough to link things.)

E: alternative title: Ideological Turing Test, a critical failure

15
submitted 8 months ago* (last edited 8 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

Original title: 'What we talk about when we talk about risk'. The article explains medical risk and why the polygenic embryo selection people think about it the wrong way. Includes a mention of one of our Scotts (you know the one). Non-archived link: https://theinfinitesimal.substack.com/p/what-we-talk-about-when-we-talk-about

11
submitted 11 months ago* (last edited 11 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on bsky informed me that I should read this short story called 'Death and the Gorgon' by Greg Egan, as he has a good handle on the subjects/subjects we talk about. We have talked about Greg before on Reddit.

I was glad I did, so going to suggest that more people do it. The only complaint you can have is that it gives no real 'steelman' airtime to the subjects/subjects it is being negative about. But well, he doesn't have to, he isn't the Guardian. Anyway, not going to spoil it, best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note I'm not sure this pdf was intended to be public, I did find it on Google, but it might not be meant to be accessible this way.)

12
submitted 2 years ago* (last edited 2 years ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

The interview itself

Got the interview via Dr. Émile P. Torres on twitter

Somebody else sneered: 'Makings of some fantastic sitcom skits here.

"No, I can't wash the skidmarks out of my knickers, love. I'm too busy getting some incredibly high EV worrying done about the Basilisk. Can't you wash them?"

https://mathbabe.org/2024/03/16/an-interview-with-someone-who-left-effective-altruism/

19

Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of Community Notes. (A project which predates the takeover of twitter by a couple of years (see the join date: https://twitter.com/CommunityNotes )).

In reaction Musk admits he never read HPMOR and he suggests a watered down Turing test involving HPMOR.

Eliezer invents HPMOR wireheads in reaction to this.

