
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

Previous week

[-] BlueMonday1984@awful.systems 3 points 12 hours ago

New Blood in the Machine piece about GPT-5's dumpster fire launch: GPT-5 is a joke. Will it matter?

[-] TinyTimmyTokyo@awful.systems 14 points 18 hours ago

Ozy Brennan tries to explain why "rationalism" spawns so many cults.

One of the reasons they give is "a dangerous sense of grandiosity".

the actual process of saving the world is not very glamorous. It involves filling out paperwork, making small tweaks to code, running A/B tests on Twitter posts.

Yep, you heard it right. Shitposting and inconsequential code are the proper way to save the world.

[-] gerikson@awful.systems 9 points 15 hours ago

JFC

Agency and taking ideas seriously aren’t bad. Rationalists came to correct views about the COVID-19 pandemic while many others were saying masks didn’t work and only hypochondriacs worried about covid; rationalists were some of the first people to warn about the threat of artificial intelligence.

First off, anyone not entirely into MAGA/Qanon agreed that masks probably helped more than hurt. Saying rats were outliers is ludicrous.

Second, rats don't take real threats of GenAI seriously - infosphere pollution, surveillance, autopropaganda - they just care about the magical future Sky Robot.

[-] sleepundertheleaves@infosec.pub 8 points 12 hours ago* (last edited 12 hours ago)

Unfortunately, in the spring of 2020, the CDC was discouraging people from wearing masks, and was saying masking would do more harm than good:

U.S. health authorities had discouraged healthy Americans from wearing facial coverings for weeks, saying they were likely to do more harm than good in the fight against the coronavirus — but now, as researchers have learned more about how the highly contagious virus spreads, officials have changed their recommendations.

U.S. health authorities have long maintained that face masks should be reserved only for medical professionals and patients suffering from COVID-19, the deadly disease caused by the coronavirus. The CDC had based this recommendation on the fact that such coverings offer little protection for wearers, and the need to conserve the country's alarmingly sparse supplies of personal protective equipment.

I pretty clearly remember the mainstream media and various liberal talking heads telling people not to mask up back then - mostly because the US was completely unprepared for a pandemic, and they thought they had to discourage people from buying masks to make sure hospitals would have enough.

Meanwhile, the right-wing prepper types were breaking out the N95 masks they'd stockpiled for a pandemic, warning each other COVID was much more contagious and lethal than the government wanted to admit, passing around conspiracy theories about millions of deaths in China covered up by the CCP, and patting themselves on the back for stockpiling masks before the government took them off the shelf.

Then some analyst told Trump that letting COVID spread unchecked would hurt blue states worse than red states, so he had Fox News start pushing anti-masking talking points, and all those conservative foot soldiers put away their masks and became super spreaders for Jesus.

But yeah. During that period from like January to March 2020, the political division around COVID was basically the opposite of what it became, and I can easily believe some "rationalists" were calling bullshit on the CDC suddenly telling people not to buy masks.

[-] Soyweiser@awful.systems 1 points 47 minutes ago

Meanwhile, the right-wing prepper types were breaking out the N95 masks they’d stockpiled for a pandemic

This included Scott of SSC, btw, who also claimed that stopping smoking helped against covid. Not that he had any proof (the medical science at the time even claimed, falsely as it came out later, that smoking helped against covid). But only the CDC gets judged, not the ingroup.

And the other Scott blamed people who sneer for making covid worse. (While at sneerclub we were going: take this seriously and wear a mask.)

So annoying that Rationalists are trying to spin this into a win for themselves. (They also were not early; their warnings matched the warnings of the WHO. I looked into the timelines the last time this was talked about.)

That's how I remember it too. Also the context about conserving N95 masks always feels like it gets lost. Like, predictably so and I think there's definitely room to criticize the CDC's messaging and handling there, but the actual facts here aren't as absurd as the current fight would imply. The argument was:

  1. With the small droplet size, most basic fabric masks offer very limited protection, if any.
  2. The masks that are effective, like N95 masks, are only available in very limited quantities.
  3. If everyone panic-buys N95s the way they did toilet paper, it will mean that the people who are least able to avoid exposure, i.e. doctors and medical frontliners, are at best going to wildly overpay and at worst won't be able to keep supplied.
  4. Therefore, most people shouldn't worry about masking at this stage, and focus on other measures like social distancing and staying the fuck home.

I think later research cast some doubt on point 1, but 2-4 are still pretty solid given the circumstances that we (collectively) found ourselves in.

[-] swlabr@awful.systems 6 points 18 hours ago

But doctor, I am L7 twitter manager Pagliacci

[-] froztbyte@awful.systems 4 points 18 hours ago

oldskool OSI appmanager is oldskool

(........sorry)

[-] swlabr@awful.systems 4 points 18 hours ago

I'm gonna need this one explained, boss

[-] froztbyte@awful.systems 3 points 18 hours ago

(in networking it's common terminology to refer to "Lx" by numerical reference, and broadly understood to be in reference to the OSI model layers)
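
For anyone who hasn't bumped into the shorthand, a minimal illustrative sketch of the OSI numbering the pun leans on (this is just the standard layer list; nothing here is specific to twitter's internal job levels):

```python
# Standard OSI model layers, numbered 1 (lowest) to 7 (highest).
# In networking shorthand, "L2" is the data link layer and "L7" the application layer.
OSI_LAYERS = {
    1: "Physical",
    2: "Data Link",
    3: "Network",
    4: "Transport",
    5: "Session",
    6: "Presentation",
    7: "Application",
}

for number, name in OSI_LAYERS.items():
    print(f"L{number}: {name}")
```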

[-] swlabr@awful.systems 4 points 17 hours ago

Aaaaa gotcha. It’s probably obvious but in my case I meant L7 manager as in “level 7 manager”, a high tier managerial position at twitter, probably. I don’t know what exact tiering system twitter uses but I know other companies might use “Lx” to designate a level.

[-] froztbyte@awful.systems 4 points 16 hours ago

I figured, but I couldn't just let a terrible pun slip me by!

[-] swlabr@awful.systems 4 points 16 hours ago

Hollywood Director Richard Link Layer. Is this anything

[-] Soyweiser@awful.systems 6 points 16 hours ago

I have not tried it yet, but apparently there is an open-source alternative to GitHub called https://codeberg.org/. Might be useful.

[-] BlueMonday1984@awful.systems 7 points 13 hours ago

It'll probably earn a lot of users if and when Github goes down the shitter. They've publicly stood with marginalised users before, so they're already in my good books.

[-] nfultz@awful.systems 4 points 10 hours ago

It’ll probably earn a lot of users if and when Github goes down the shitter.

I'd argue GH is well on its way, probably jumped the shark around the time Hacktoberfest morphed into a DDoS on maintainers. Or maybe more recently, when they handed people's repos (and API keys lol) over to Copilot. Or maybe earlier, when they started calling their users "maintainers" instead of "developers". Sometime in the last 6 years though.

There have been a number of contenders over the years - GitLab, Gitea - but none of them have been able to brand/market well enough to really impact GH, or to compete with the subsidized free storage and Actions credits plus the switching costs. Even Atlassian / BB is largely irrelevant.

[-] BlueMonday1984@awful.systems 12 points 19 hours ago

Tante fires off about web search:

There used to be this deal between Google (and other search engines) and the Web: You get to index our stuff, show ads next to them but you link our work. AI Overview and Perplexity and all these systems cancel that deal.

And maybe - for a while - search will also need to die a bit? Make the whole web uncrawlable. Refuse any bots. As an act of resistance to the tech sector as a whole.

On a personal sidenote, part of me suspects webrings and web directories will see a boost in popularity in the coming years - with web search in the shitter and AI crawlers being a major threat, they're likely your safest and most reliable method of bringing human traffic to your personal site/blog.

[-] BurgersMcSlopshot@awful.systems 9 points 19 hours ago

Mastodon post linking to the least shocking Ars lede I have seen in a bit. Apparently "reasoning" and "chain of thought" functionality might have been entirely marketing fluff? :shocked pikachu:

[-] Soyweiser@awful.systems 3 points 16 hours ago

Wait, but if they lied about that... what else do they lie about?

[-] mountainriver@awful.systems 1 points 13 hours ago

Not a sneer, but I recently saw Ari K's AI-generated video of Trump in his golden ballroom. It's quite good; here is the channel: https://m.youtube.com/@AriKuschnir

Looking at his other videos, he is a talented storyteller. Most videos are about two minutes long, with numerous short shots of a few seconds each and a voiceover or music connecting them. So presumably he generates the shots, splices them together, and puts the soundtrack over it. Most of the short stories are dreamlike. To the extent they have characters, it's famous people (getting their comeuppance), so even though they look a bit different in each shot, it's easy to keep track.

I think it's interesting because, by doing what can be done with the tools, it illustrates their limitations. In the hands of a good storyteller you essentially get an illustration for a short radio play (and the radio play needs to be recorded separately, and you can't show actors talking). Because of the bubble and investor bux, it can right now be done on a shoestring budget.

But that's all! Are illustrated radio plays replacing feature films? No, so this remains a niche use case. And once the investor bux dry up, potentially an expensive one. Not something to build a billion-dollar industry on.

[-] Amoeba_Girl@awful.systems 1 points 1 hour ago

I'm sorry, I believe that there are legitimate artistic uses of neural networks (and they're never about cutting budget), but this is just fascist aesthetics repurposed to serve anti-Trump messaging. Do not like.

[-] BurgersMcSlopshot@awful.systems 5 points 12 hours ago

Where are you getting "talented storyteller" from? The whole thing is some heavy-handed, ham-fisted fever dream that I would expect from some liberal engagement farm. And "illustrated radio play"? What are you even on about.

The video looks like garbage and the rapid cuts are severely grating. The construction is lackluster and the content is garbage. It appears you have brought us a piece of the internet to throw into the garbage bin.

[-] YourNetworkIsHaunted@awful.systems 1 points 3 hours ago* (last edited 22 minutes ago)

Yeah. I think there's definitely something interesting here, but it's mostly in how badly compromised the final product ends up being in order to support the AI tools.

[-] TinyTimmyTokyo@awful.systems 2 points 12 hours ago* (last edited 12 hours ago)

How much energy was used to produce that video?

[-] sailor_sega_saturn@awful.systems 6 points 20 hours ago

"The common people pray for anime memes, healthy vtubers, and a wikipedia article that never ends," Ser Jorah told her. "It is no matter to them if the high lords play their game of tweets, so long as they are left in peace." He gave a shrug. "They never are.”

- George R. R. Martin

[-] Soyweiser@awful.systems 5 points 22 hours ago

I'm still thinking about the article about the NRx party from last week and just how classless (who pours champagne wrong?) and sad it showed them to be, while still being obsessed with their image. Such a sad bunch, their ideas have reached the higher ups of American power and still they obsess about how a journalist (who is dating one of them (he is into the 'we live in a simulation', break up with him, you are in danger)) might write something bad about them. (See also how many of these sad sacks got fired/blackballed for just having no internal filter (dressing up gay people as the KKK really?)). The creme de la creme of intellectual thought and they talk and act like a bunch of 4channers. (Yarvin must know this, his shit about how billionaires act must be a bit of projection). I'm talking about this piece: https://archive.ph/gm3Za Sorry to repost it, I just had a 'layer 2 well done' reminder and cringed again, fucking larpers (No shade to people who actually larp, seems fun, just cringe to do it irl).

[-] scruiser@awful.systems 11 points 1 day ago* (last edited 1 day ago)

So... apparently Peter Thiel has taken to co-opting fundamentalist Christian terminology to go after Effective Altruism? At least it seems that way from this EA post (warning, I took psychic damage just skimming the lunacy). As far as I can tell, he's merely co-opting the terminology; Thiel's blather doesn't have any connection to any variant of Christian eschatology (whether mainstream, fundamentalist, or even obscure wacky fundamentalist). But of course, the majority of the EAs don't recognize that, or the fact that he is probably targeting them for their (kind of weak, to be honest) attempts at getting AI regulated at all, and instead they charitably try to steelman him and figure out if he has a legitimate point. ...I wish they could put a tenth of this effort into understanding leftist thought.

Some of the comments are... okay actually, at least by EA standards, but there are still plenty of people willing to defend Thiel.

One comment notes some confusion:

I’m still confused about the overall shape of what Thiel believes.

He’s concerned about the antichrist opposing Jesus during Armageddon. But afaik standard theology says that Jesus will win for certain. And revelation says the world will be in disarray and moral decay when the Second Coming happens.

If chaos is inevitable and necessary for Jesus’ return, why is expanding the pre-apocalyptic era with growth/prosperity so important to him?

Yeah, it's because he is simply borrowing Christian fundamentalist eschatological terminology... possibly to try to turn the Christofascists against EA?

Someone actually gets it:

I'm dubious Thiel is actually an ally to anyone worried about permanent dictatorship. He has connections to openly anti-democratic neoreactionaries like Curtis Yarvin, he quotes Nazi lawyer and democracy critic Carl Schmitt on how moments of greatness in politics are when you see your enemy as an enemy, and one of the most famous things he ever said is "I no longer believe that freedom and democracy are compatible". Rather I think he is using "totalitarian" to refer to any situation where the government is less economically libertarian than he would like, or "woke" ideas are popular amongst elite tastemakers, even if the polity this is all occurring in is clearly a liberal democracy, not a totalitarian state.

Note this commenter still uses non-confrontational language ("I'm dubious") even when directly calling Thiel out.

The top comment, though, is just like the main post, extending charitability to complete technofascist insanity. (Warning for psychic damage)

Nice post! I am a pretty close follower of the Thiel Cinematic Universe (ie his various interviews, essays, etc)

I think Thiel is also personally quite motivated (understandably) by wanting to avoid death. This obviously relates to a kind of accelerationist take on AI that sets him against EA, but again, there's a deeper philosophical difference here. Classic Yudkowsky essays (and a memorable Bostrom short story, video adaptation here) share this strident anti-death, pro-medical-progress attitude (cryonics, etc), as do some philanthropists like Vitalik Buterin. But these days, you don't hear so much about "FDA delenda est" or anti-aging research from effective altruism. Perhaps there are valid reasons for this (low tractability, perhaps). But some of the arguments given by EAs against aging's importance are a little weak, IMO (more on this later) -- in Thiel's view, maybe suspiciously weak. This is a weird thing to say, but I think to Thiel, EA looks like a fundamentally statist / fascist ideology, insofar as it is seeking to place the state in a position of central importance, with human individuality / agency / consciousness pushed aside.

As for my personal take on Thiel's views -- I'm often disappointed at the sloppiness (blunt-ness? or low-decoupling-ness?) of his criticisms, which attack the EA for having a problematic "vibe" and political alignment, but without digging into any specific technical points of disagreement. But I do think some of his higher-level, vibe-based critiques have a point.

[-] corbin@awful.systems 6 points 17 hours ago* (last edited 17 hours ago)

Thiel is a true believer in Jesus and God. He was raised evangelical. The quirky eschatologist that you're looking for is René Girard, who he personally met at some point. For more details, check out the Behind the Bastards on him.

Edit: I wrote this before clicking on the LW post. This is a decent summary of Girard's claims as well as how they influence Thiel. I'm quoting West here in order to sneer at Thiel:

Unfortunately (?), Christian society does not let us sacrifice random scapegoats, so we are trapped in an ever-escalating cycle, with only poor substitutes like “cancelling celebrities on Twitter” to release pressure. Girard doesn’t know what to do about this.

Thiel knows what to do about this. After all, he funded Bollea v. Gawker. Instead of letting journalists cancel celebrities, why not cancel journalists instead? Then there's no longer any journalists to do any cancellation! Similarly, Thiel is confirmed to be a source of funding for Eric Weinstein and believed to fund Sabine Hossenfelder. Instead of letting scientists cancel religious beliefs, why not cancel scientists instead? By directing money through folks with existing social legitimacy, Thiel applies mimesis: pretend to be legitimate and you can shift what is legitimate.

In this context, Thiel fears the spectre of AGI because it can't be influenced by his normal approach to power, which is to hide anything that can be hidden and outspend everybody else talking in the open. After all, if AGI is truly to unify humanity, it must unify our moralities and cultures into a single uniformly-acceptable code of conduct. But the only acceptable unification for Thiel is the holistic catholic apostolic one-and-only forever-and-ever church of Jesus, and if AGI is against that then AGI is against Jesus himself.

[-] blakestacey@awful.systems 4 points 12 hours ago

Is there any more solid evidence of Hossenfelder taking Thielbux, or is this just a guess based on the orbit she moves in: appearing on Michael Shermer's podcast years after the news broke that he was a sex pest, blurbing the new book edited by sex pest Lawrence Krauss, etc.?

[-] scruiser@awful.systems 2 points 10 hours ago* (last edited 10 hours ago)

The quirky eschatologist that you’re looking for is René Girard, who he personally met at some point. For more details, check out the Behind the Bastards on him.

Thanks for the references. The quirky theology was so outside the range of even the weirder Fundamentalist Christian stuff I didn't recognize it as such. (And didn't trust the EA summary because they try so hard to charitably make sense of Thiel).

In this context, Thiel fears the spectre of AGI because it can’t be influenced by his normal approach to power, which is to hide anything that can be hidden and outspend everybody else talking in the open.

Except the EAs are, on net, opposed to the creation of AGI (albeit they are ineffectual in their opposition). So going after the EAs doesn't make sense if Thiel is genuinely opposed to inventing AGI faster. So I still think Thiel is just going after the EAs because he's libertarian and EA has shifted in the direction of trying to get more government regulation (as opposed to acting on some coherent theological goal beyond libertarianism). I'll check out the BtB podcast and see if it changes my mind as to his exact flavor of insanity.

[-] Soyweiser@awful.systems 8 points 1 day ago* (last edited 1 day ago)

Yeah, it's because he is simply borrowing Christian fundamentalist eschatological terminology… possibly to try to turn the Christofascists against EA?

Yep, the usefulness of EA is over, they are next on the chopping block. I'd imagine a similar thing will happen to redscare/moldbug if they ever speak out against him.

E: And why would a rich guy be against a "we are trying to convince rich guys to spend their money differently" organization. Esp a 'libertarian' "I get to do what I want or else" one.

[-] scruiser@awful.systems 2 points 10 hours ago

And why would a rich guy be against a “we are trying to convince rich guys to spend their money differently” organization.

Well when they are just passively trying to convince the rich guys, they can use the organization to launder reputation or boost ideologies they are in favor of. When the organization actually tries to get regulations passed, even ineffectually, well, that is a threat to the likes of Thiel.

[-] gerikson@awful.systems 6 points 23 hours ago

It always struck me as hilarious that the EA/LW crowd could ever affect policy in any way. They're cosplaying as activists, have no ideas about how to move the public image needle other than weird movie ideas and hope, and are literally marinated in SV technolibertarianism which sees government regulation as Evil.

There's a mini-freakout over OpenAI deciding to keep GPT-4o active, despite it being more "sycophantic" than GPT-5 (and thus more likely to convince people to do Bad Things), but there's also the queasy realization that if sycophantic LLMs are what bring in the bucks, nothing is gonna stop LLM companies from offering them. And there's no way these people can stop it, because they've made the deal that the LLM companies themselves are gonna be the ones to realize that AI is gonna kill everyone, and that's never gonna happen.

[-] scruiser@awful.systems 3 points 10 hours ago

They’re cosplaying as activists, have no ideas about how to move the public image needle other than weird movie ideas and hope, and are literally marinated in SV technolibertarianism which sees government regulation as Evil.

It is kind of sad. They are missing the ideological pieces that would let them carry out activism effectually so instead they've gotten used as a free source of crit-hype in the LLM bubble. ...except not that sad because they would ignore real AI dangers in favor of their sci-fi scenarios, so I don't feel too bad for them.

[-] o7___o7@awful.systems 2 points 7 hours ago* (last edited 7 hours ago)

Brian Merchant's article about that lighthaven gathering really struck me.

The men who EAs think will end the earth were in the building with them, and rather than organize to throw them out a window (or even to just make them mildly uncomfortable), the bayes knowers all gormlessly moped around their twee boutique hotel and cried around some whiteboards.

Absolute hellish brainworms

[-] DonPiano@feddit.org 1 points 2 hours ago

Ooh, do you have a link to share?

[-] gerikson@awful.systems 6 points 1 day ago

Using the term "Antichrist" as a shorthand for "global stable totalitarianism" is A Choice.

[-] JFranek@awful.systems 5 points 23 hours ago

I think Leathery Pete might have read too much Left Behind.

[-] istewart@awful.systems 11 points 1 day ago

tl;dr: Thiel now sees the Christofascists as a more durable grifting base than the EAs, and is looking to change lanes while the temporary coalitions of maximalist Trumpism offer him the opportunity.

I repeat my suspicion that Thiel is not any more sober than Musk, he's just getting sloppier about keeping it out of the public eye.

[-] zogwarg@awful.systems 8 points 1 day ago

I think a big difference between Thiel and Musk is that Thiel views himself as an "intellectual" and derives prestige from "intellectualism". I don't believe for a minute he's genuinely Christian, but his wankery about an end-of-times eschatology where armageddon = big-left-government is a bit too confused to be purely cynical; I think sniffing his own farts feeds his ego.

Of course a man who would promote open doping olympics isn't sober.
