[-] Soyweiser@awful.systems 3 points 3 hours ago* (last edited 3 hours ago)

Nope. And tbh, did some DD2 recently, and for a very short while I was tempted to push the edit button, but then I remembered that Fextralife just tries to profit off my wiki-editing labor. (I still like the idea of wikis, but don't have the fortitude and social calm to edit a mainstream one like Wikipedia.) (I did a quick check, and yeah, I also really hate the license Fextralife/Valnet uses: "All contributions to any Fextralife.com Wiki fall under the below Contribution Agreement and become the exclusive copyrighted property of Valnet." And their editor sucks ass (show me the actual code, not this WYSIWYG shit).)

[-] Soyweiser@awful.systems 4 points 8 hours ago

Like Clinton starting a GoFundMe for a coworker with cancer, the rich and their money are not voluntarily parted.

[-] Soyweiser@awful.systems 4 points 9 hours ago* (last edited 9 hours ago)

See also how youtube tutorials have mostly killed (*) text-based tutorials/wikis while being just inferior to good text-based ones. Both because listening to a person talk is a linear experience, while text allows for easy scrolling, but also because most people are just bad at making yt tutorials. (Shoutout to the one which had annoyingly long random pauses in and between sentences, even at 2x speed.)

This is not helped by the fact that youtube is now a source of revenue, while updating a wiki/tutorial often is not. So the incentives are all wrong. A good example of this is the gaming wiki Fextralife: see this page on Dragon's Dogma 2 NPCs: https://dragonsdogma2.wiki.fextralife.com/NPCs (the game has been out for over a year, in case the weirdness doesn't jump out at you). But the big thing for Fextralife is their youtube tutorials, and the wiki used to have an autoplaying link to their streams. This isn't a wiki, it is an advertisement for their youtube channel and livestreams.

And while this is a big example, the problem persists with smaller youtubers, who suffer from an extreme publish-or-perish (and do-not-deviate-from-your-niche) dynamic. They can't put in the time to update things, because they need to publish a new video (on their niche, since branching out is punished) soon or not pay rent. (For people out there who play videogames and/or watch youtube, this is also why somebody like the Spiffing Brit has long since gone from 'I exploit games' to 'I grind, and if you grind enough in this single-player game you become OP'. The content must flow, but eventually you will run out of good new ideas. (Also why he tried to push his followers into doing risky cryptocurrency-related 'cheats' (follow Elon; if he posts a word that can be made into a cryptocoin, pump and dump it for half an hour).))

*: They still exist but tend to be very bad quality, even worse now people are using genAI to seed/update them.

[-] Soyweiser@awful.systems 9 points 10 hours ago* (last edited 9 hours ago)

Sealioning is a bit more specific, as they do not stop and demand way more evidence than is normal. Scott had a term for this, forgot it already (one of those more useful Rationalist ideas, which they only employ asymmetrically themselves). Noticed it recently on reddit: some person was mad I didn't properly counter Yud's arguments, while misrepresenting my position (which wasn't that strong tbh, I just quickly typed it up before I had other things to do). But it is very important to take Yud's arguments seriously for some reason; reminds me of creationists.

Think just calling them AI concern trolls works.

[-] Soyweiser@awful.systems 8 points 10 hours ago* (last edited 10 hours ago)

So, a bit of a counter to our usual stuff. A migrant worker here won a case against his employer, who had linked his living space to his employment contract (which is forbidden), using chatgpt as an aid (how much is not told). So there actually was a case where it helped.

Interesting note on it: these sorts of cases have no jurisprudence yet, so that might have been a factor. No good links for it sadly, as it was all in Dutch. (Can't even find a proper writeup on a bigger news site, as a foreigner defending their rights against abuse is less interesting than some other country getting a new bishop.) Skeets congratulating the guy here: https://bsky.app/profile/isgoedhoor.bsky.social/post/3m27aqkyjjk2c (in Dutch). Nothing much about the genAI usage.

But this does fit a pattern: like with blind or low-vision people, these tools are being used by people who have no other recourse because we refuse to help them. (This is bad tbh. I'm happy they are getting more help, don't get me wrong, but it shouldn't be this substandard.)

[-] Soyweiser@awful.systems 5 points 10 hours ago

Yes that is what people want, more generic shit.

[-] Soyweiser@awful.systems 12 points 23 hours ago* (last edited 23 hours ago)

Tyler Cowen saying some really weird shit about an AI 'actress'.

(For people who might wonder why he is relevant: see his 'see also' section on wikipedia.)

E: And you might think, rightfully imho, that this cannot be real, that this must be an edit. https://archive.is/vPr1B I have bad news.

[-] Soyweiser@awful.systems 9 points 1 day ago* (last edited 1 day ago)

Friend: "I have a problem"

Me, with a stack of google printouts: "My time to shine!".

E: oh god, I thought there were multiple examples and the friend one was just a random one. No, it was the first example: 'I gave my friend a printout, which saved me time'. Also, as I assume the friend is still unhoused and they didn't actually use the printout yet, he doesn't know if this actually helped. Atwood isn't a 'helping the unhoused' expert. He just assumed it was a good source. The story ends when he hands over the paper.

Also very funny that he is also going 'you just need to know how to ask questions the right way, which I learned by building stackoverflow'. Yeah, euh, that is not a path a lot of people can follow.

[-] Soyweiser@awful.systems 6 points 1 day ago

There have been a lot of cases in history of smart people being bested by the dumbest people around, who just had more guns / a gun / copious amounts of meth / a stupid idea but got lucky once, etc.

I mean, if they are so smart, why are they stuck in a locker?

[-] Soyweiser@awful.systems 5 points 1 day ago* (last edited 23 hours ago)

Secretly been eyeing your prompt. Are you ready to get spontaneous? Just say so.

(Somebody linked two chatgpts (or groks, I don't recall which anus-like logo it was) speaking to each other, and they kept repeating variants of the last bits.)

E: bingo this one: https://www.tiktok.com/@aarongoldyboy/video/7555260691947588895

[-] Soyweiser@awful.systems 4 points 1 day ago* (last edited 1 day ago)

On the one hand, that Ani thing has the most cringe tone for a chatbot. "Secretly been eyeing your [blank]..." (damn, grok sounds formulaic). But otoh, they are debating a bot. You can only do that the first few months you are online; after that you should know better.

You cannot simultaneously claim to care about the "declining birth rate" while also supporting AI "companions"

Actually, eugenicists can, quite easily in fact. (Repeating the word 'degenerate' and not getting this is quite something.)

This is transhumanist

No.

[-] Soyweiser@awful.systems 5 points 1 day ago* (last edited 23 hours ago)

E: eurgh, had forgotten the whole bsky waffle thing. Wasn't making a reference to that, sorry if it came off as insensitive. Got rid of the comment anyway.

(I had totally forgotten this post, and also didn't expect the CEO to double/triple down and now make mentioning waffles into an anti-trans dogwhistle.)

13

Via reddits sneerclub. Thanks u/aiworldism.

I have called LW a cult incubator for a while now, and while the term has not caught on, it is nice to see more reporting on the problem that lw makes you more likely to join a cult.

https://www.aipanic.news/p/the-rationality-trap is the original link, for the people who don't like archive.is. I used the archive because I don't like substack and want to discourage its use.

21
submitted 1 month ago* (last edited 1 month ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

As found by @gerikson here, more from the anti-anti-TESCREAL crowd. How the antis are actually REPRESENTATIONALism. Ottokar expanded on their idea in a blog post.

Original link.

I have not read the bigger blog post yet btw, just assumed it would be sneerable and posted it here for everyone's amusement. Learn about your own true motives today. (This could be a troll of course; boy, does he drop a lot of names and think that is enough to link things.)

E: alternative title: Ideological Turing Test, a critical failure

15
submitted 1 month ago* (last edited 1 month ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

Original title: 'What we talk about when we talk about risk'. The article explains medical risk and why the polygenic embryo selection people think about it the wrong way. Includes a mention of one of our Scotts (you know the one). Non-archived link: https://theinfinitesimal.substack.com/p/what-we-talk-about-when-we-talk-about

11
submitted 4 months ago* (last edited 4 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on bsky informed me that I should read the short story 'Death and the Gorgon' by Greg Egan, as he has a good handle on the subjects we talk about. We have talked about Greg before on Reddit.

I was glad I did, so I'm going to suggest more people do it too. The only complaint you can have is that it gives no real 'steelman' airtime to the subjects it is being negative about. But well, he doesn't have to, he isn't the Guardian. Anyway, not going to spoil it, best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note: I'm not sure this pdf was intended to be public. I did find it on google, but it might not be meant to be accessible this way.)

12
submitted 2 years ago* (last edited 2 years ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

The interview itself

Got the interview via Dr. Émile P. Torres on twitter

Somebody else sneered: 'Makings of some fantastic sitcom skits here.

"No, I can't wash the skidmarks out of my knickers, love. I'm too busy getting some incredibly high-EV worrying done about the Basilisk. Can't you wash them?"'

https://mathbabe.org/2024/03/16/an-interview-with-someone-who-left-effective-altruism/

19

Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of community notes. (A project which predates the takeover of twitter by a couple of years; see the join date: https://twitter.com/CommunityNotes )

In reaction, Musk admits he never read HPMOR, and he suggests a watered-down Turing test involving HPMOR.

Eliezer invents HPMOR wireheads in reaction to this.

