[-] Soyweiser@awful.systems 21 points 2 months ago* (last edited 2 months ago)

Time to start a new business to distract people from the fact that the previous one is not living up to expectations. Musk style. (A degree of vertical integration will also be involved.)

E: '"The Grok integration with X has made everyone jealous,” says someone working at another big AI lab. “Especially how people create viral tweets by getting it to say something stupid.”' These are deeply unserious people, billions of dollars just to build AI Dril, which was already a thing. Is there some weird nerd somewhere who said something like 'culture is downstream from viral memes' or something just as dumb? Related to that, so I have basically quit twitter, and only visit there to look up if a quoted thing was real etc, but damn the site has gotten bad. How does anybody use it when so many replies are either bot replies, ai replies or people using their checkmark to push their one word replies to the top? Esp bigger accounts/viral tweets just get swarmed with shit. It has a bit of the 'comment section of abandoned blog' feeling to it. And this is the validation the AI company craves?

[-] Soyweiser@awful.systems 21 points 6 months ago* (last edited 6 months ago)

The progression from 'I realized wanting to be Elon is cringe' -> 'I'm gonna join DOGE' -> 'no, I'm gonna do physics' (like Elon).

Yo dude you were so close at first, you almost had it! You fool!

Really hope that friend was also just joking because he knew there would be no crash, as the default can be extended. (But considering rich people's friends are usually all in the same mindspace as the rich person (see for example the text messages revealed in the Musk vs Twitter lawsuit), I doubt this.)

[-] Soyweiser@awful.systems 21 points 8 months ago

For any drive-by readers who slightly want to know more background about some stuff: Why Richard Feynman wasn't a great role model.

(More context for the drive-by readers: Feynman is often held up as a great man in STEM circles, and even more so in the world of LessWrong Rationalism. Which is fine if it is about scientific achievements, but it often goes beyond that, as here in their page on 'traditional rationality'.)

[-] Soyweiser@awful.systems 21 points 8 months ago

The intern was only doing their fiduciary duty to the shareholders!

[-] Soyweiser@awful.systems 21 points 8 months ago

Hey mate, did you get your PhD or a fucking Nobel in linguistics by any chance?

Early onset Nobel disease.

[-] Soyweiser@awful.systems 21 points 11 months ago

"I know not with what GPT-5 will will reply with, but GPT-6 will reply with 'Unfortunately as an language model I Unfortunately as an language model I I I Unfortunately Unfortunately Unfortunately model model model'" Albert GPTstein.

[-] Soyweiser@awful.systems 21 points 11 months ago* (last edited 11 months ago)

Roko even said the 'I think agricultural waste products like straw can be substituted for sawdust so maybe you are paid to take it off their hands.[emph mine]' line. Which is always a good sign when somebody is trying to do the economic feasibility math.

[-] Soyweiser@awful.systems 21 points 11 months ago* (last edited 11 months ago)

Themotte is a spinoff from the slatestarcodex subreddit. They used to have a culture war thread every week, which always filled up with (far) right culture war grievances that they then all pretended to discuss neutrally. (The effect was mostly that people's Overton window was dragged rightward.) Eventually the stink of it got so bad that Scott Alexander had to take action (he wrote slatestarcodex before he sold out and now runs a substack), so he told them, with his semi-blessing, to go make their own subreddit.

Which was pretty bad: a place where you could say anything (especially right-wing stuff) as long as you used enough words to explain it. (The 14 words did sneak in at times, and I saw (sadly didn't archive it) openly neo-nazi homesteaders try to recruit there. So a lot of 'themotte people didn't think they were on the same side as the far right, but the far right certainly did think that'.) Etc etc. Eventually the reddit admins got more and more annoyed with the place, and (this is a bit speculative on my part) after somebody got banned by the admins(*) for explaining the triple-parenthesis dogwhistle, they freaked out and made their own site.

*: A thing I noticed at the time, which themotte people didn't: the guy who got banned was from Germany, which has somewhat stronger rules regarding antisemitism, so the admins banning him for that wouldn't surprise me, especially if it was automated. And for the people who don't know: on reddit every forum (called a subreddit) has mods who come from that forum, while the site itself has admins, which is why there is sometimes moderation conflict. The mods were afraid the admins were about to ban the subreddit, which would destroy all the past content, so they made a new site.

E: also, wow, themotte has gotten a lot worse since it went offsite. Jesus, some of the comments in that thread. I feel bad for TWG.

TL;DR:

What is this, rat-4chan?

Yes

[-] Soyweiser@awful.systems 21 points 1 year ago* (last edited 1 year ago)

partially due to impossible bandwidth and compression requirements

It still amazes me that they publicly posted a request for help with these compression requirements, which are physically impossible to achieve. Nobody with a CS degree is anywhere near the leadership of Neuralink. In other words, you are downplaying how impossible the requirements were.
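(For anyone curious why "physically impossible" is the right word: lossless compression is bounded by the entropy of the data, and noisy sensor data carries a lot of entropy. A minimal sketch below, with made-up illustrative figures rather than Neuralink's actual spec, shows the kind of back-of-envelope check that rules out a huge lossless target.)

```python
import numpy as np

# Hedged sketch: estimate the best possible lossless compression ratio for a
# noisy quantized signal. All numbers here are illustrative assumptions, not
# the real Neuralink challenge parameters.

rng = np.random.default_rng(0)
n_samples = 1_000_000
bits_per_sample = 10                       # assumed ADC resolution

# Model one electrode: sparse spikes buried in broadband noise.
spikes = (rng.random(n_samples) < 0.01) * 200
noise = rng.normal(0, 30, n_samples)
samples = np.clip(np.round(spikes + noise) + 512, 0, 1023).astype(int)

# Empirical Shannon entropy (bits per sample) of the quantized stream.
counts = np.bincount(samples, minlength=1024)
p = counts[counts > 0] / n_samples
entropy = -(p * np.log2(p)).sum()

# No lossless coder can beat raw bits divided by entropy.
best_ratio = bits_per_sample / entropy
print(f"entropy ≈ {entropy:.2f} bits/sample, "
      f"best possible lossless ratio ≈ {best_ratio:.2f}x")
```

With noise like this the bound lands around 1-2x, so any target in the hundreds is asking for something information theory forbids.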

[-] Soyweiser@awful.systems 21 points 1 year ago* (last edited 1 year ago)

Not only is this real, I think this is a paraphrase of a thing Yud wrote. Which makes it even cultier. (A reason why I called the Rationalismsphere a cult incubator, as their teachings make you more susceptible to getting into cults).

Edit: For example, his writing on cults, and more specifically 'Every Cause Wants To Be A Cult'. (Look, Cade Metz is referenced, before they turned on him.)

[-] Soyweiser@awful.systems 21 points 1 year ago

Austin from Manifest responds that leftist views would obviously be much more damaging to EA than racist ones, because reasons.

I can accept racism, but I draw the line at suggesting that self enrichment is bad.

[-] Soyweiser@awful.systems 21 points 2 years ago* (last edited 2 years ago)

For decades he built a belief system where high intelligence is basically magic. That is needed to power his fears of AGI turning everything into paperclips, and it has become such a load-bearing belief (one of the reasons for it is a fear of death and grief over people he lost, so not totally weird) that other assumptions get bolted onto it. For example, we know that computers today are pretty limited by matter; the higher-end ones need all kinds of metals which must be mined, etc. That is why he switches his fears to biology, as biology is 'cheap', 'easy' and 'everywhere'. The patterns in his reasoning are not that hard to grok. That is also why he thinks LLMs will lead to AGI, and why he insists they are at the start of their development, not the end ('it is like the early internet!'); personally I think we are mostly at the end and will just see a few relatively minor improvements, no big revolutionary leap. On some level he needs this.

Men will nuke datacenters before going to therapy for grief and their mid-life crisis.

