
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[-] YourNetworkIsHaunted@awful.systems 16 points 2 weeks ago

Sneer inspired by a thread on the preferred Tumblr aggregator subreddit.

Rationalists found out that human behavior didn't match their ideological model, and then, rather than abandon their model or change their ideology, decided to replace humanity with AIs designed to behave the way they think humans should, just as soon as they can figure out a way to do that without the AIs destroying all life in the universe.

load more comments (2 replies)
[-] mawhrin@awful.systems 15 points 1 week ago* (last edited 1 week ago)

david heinemeier hansson of ruby on rails fame decided to post a white supremacist screed with a side of transphobia, because now he doesn't need to pretend anything anymore. it's not surprising, he was heading this way for a while, but seeing the naked apologia for fascism is still shocking to me.

any reasonable open source project he participates in should immediately cut ties with the fucker. (i'm not holding my breath waiting, though.)

[-] fishidwardrobe@mastodon.me.uk 2 points 6 days ago* (last edited 6 days ago)

@mawhrin @BlueMonday1984 bad news, he's just financed a coup of Rubygems…

(well, shopify did. he's on the board. so, looks like, best guess.)

(edit: fixed wrong company name.)

[-] nightsky@awful.systems 12 points 1 week ago* (last edited 1 week ago)

Urgh, I couldn't even get through the whole article, it's too disgusting. What a surprise that yet another "no politics at work"-guy turns out to support fascism!

load more comments (3 replies)
[-] BlueMonday1984@awful.systems 14 points 2 weeks ago

Starting things off with a newsletter by Jared White that caught my attention: Why “Normies” Hate Programmers and the End of the Playful Hacker Trope, which directly discusses how the public perception of programmers has changed for the worse, and how best to rehabilitate it.

Adding my own two cents, the rise of gen-AI has definitely played a role here - I'm gonna quote Baldur Bjarnason directly, since he said it better than I could:

[-] istewart@awful.systems 14 points 2 weeks ago

This is an interesting crystallization that parallels a lot of thoughts I've been having, and it's particularly hopeful that it seeks to discard the "hacker" moniker and instead specifically describe the subjects as programmers. Looking back, I was only becoming terminally online circa 1997, and back then it seemed like there was an across-the-spectrum effort to reclaim the term "hacker" into a positive connotation after the federal prosecutions of the early 90s. People from aspirant-executive types like Paul Graham to dirty hippies like RMS were insistent that being a "hacker" was a good thing, maybe the best possible thing. This was, of course, a dead letter as soon as Facebook set up at "One Hacker Way" in Menlo Park, but I'd say it's definitely for the best to finally put a solid tombstone on top of that cultural impulse.

All the more so because the defining activity of the positive-good "hacker", as I understand it, is all too close to Zuckerberg's "move fast and break things," and I think Jared White would probably agree with me. Paul Graham was willing to embrace the term because he was used to the interactive development style of Lisp environments, but the mainstream tools have only fitfully evolved in that direction at best. When "hacking," the "hacker" makes a series of short, small iterations with a mostly nebulous goal in mind, and the bulk of the effort may actually be what's invested in the minimum viable product. The self-conception inherits from geek culture a slumped posture of almost permanent insufficiency, perhaps hiding a Straussian victimhood complex to justify maintaining one's own otherness.

In mentioning Jobs, the piece gestures towards the important cultural distinction that I still think is underexamined. If we're going to reclaim and rehabilitate even homeopathic amounts of Jobs' reputation, the thesis we're trying to get at is that his conception of computers as human tools is directly at odds with the AI promoters' (and, more broadly, most cloud vendors') conception of computers as separate entities. The development of generative AI is only loosely connected with the sanitized smiley-face conception of "hacking." The sheer amount of resources and time spent on training forecloses the possibility of a rapid development loop, and you're still not guaranteed viable output at the end. Your "hacks" can devolve into a complete mess, and at eye-watering expense.

I went and skimmed Graham's Hackers and Painters again to see if I could find any choice quotes along these lines, since he spends that entire essay overdosing on the virtuosity of the "hacker." And hoo boy:

Measuring what hackers are actually trying to do, designing beautiful software, would be much more difficult. You need a good sense of design to judge good design. And there is no correlation, except possibly a negative one, between people's ability to recognize good design and their confidence that they can.

You think Graham will ever realize that we're watching the culmination of a generation of his precious "hackers" who ultimately failed at all this?

[-] mirrorwitch@awful.systems 9 points 2 weeks ago

re: last line: no, he never will admit or concede to a single damn thing, and that's why every time I remember this article exists I have to reread dabblers & blowhards one more time purely for defensive catharsis

load more comments (1 replies)
load more comments (2 replies)
[-] CinnasVerses@awful.systems 12 points 2 weeks ago* (last edited 2 weeks ago)

AFAIK the USA is the only country where programmers make very high wages compared to other college-educated people in a profession anyone can enter. It's a myth that so-called STEM majors earn much more than others, although people with a professional degree often launch their careers quicker than people without (but if you really want to launch your career quickly, learn a trade or work in an extractive industry somewhere remote). So I think for a long time programmers in the USA made peace with FAANG because they got a share of the booty.

load more comments (1 replies)
[-] Soyweiser@awful.systems 9 points 2 weeks ago* (last edited 2 weeks ago)

Hackers is dead. (Apologies to punk)

I'd say that for one reason alone: when Musk claimed grok was from the Guide, nobody really turned on him.

Unrelated to programmers or hackers: Elon's father (CW: racism) went fully mask off and claims Elon agrees with him, which, considering Elon's promotion of the UK racists, does not feel off the mark. (And he is spreading the dumb '[Africans] have an [average] IQ of 63' shit, and claims it is all genetic. Sure man, the average African needs help understanding the business end of a hammer. As I said before, guess I met the smartest Africans in the world then, as my university had a few smart exchange students from an African country. If you look at his statements it is even dumber than usual: he says "population", which means either non-Black Africans are not included, showing just how much he thinks of himself as the other, or they are included, and the Black African average he claims is even lower.)

[-] dgerard@awful.systems 14 points 1 week ago* (last edited 1 week ago)

the talking point about disparaging terms for people who use AI by choice ("I came up with a racist-sounding term for AI users, so if you say 'clanker' you must be a racist") is so fucking stupid it's gotta be some sort of op

(esp when the made-up racist-sounding term turns out to have originated with Warren fucking Ellis)

i am extremely disappointed that awful systems users have fallen for it for a moment

load more comments (4 replies)
[-] blakestacey@awful.systems 14 points 2 weeks ago

Regarding occasional sneer target Lawrence Krauss and his co-conspirators:

Months of waiting but my review copy of The War on Science has arrived.

I read Krauss’ introduction. What the fuck happened to this man? He comes off as incapable of basic research, argument, basic scholarship. [...] Um... I think I found the bibliography: it's a pdf on Krauss' website? And all the essays use different citation formats?

Most of the essays don't include any citations in the text but some have accompanying bibliographies?

I think I'm going insane here.

What the fuck?

https://bsky.app/profile/nateo.bsky.social/post/3lyuzaaj76s2o

[-] nightsky@awful.systems 18 points 2 weeks ago

Huh, I wonder who this Krauss guy is, haven't heard of him.

*open wikipedia*

*entire subsection titled "Allegations of sexual misconduct"*

*close wikipedia*

[-] blakestacey@awful.systems 15 points 2 weeks ago

Image description: Screenshot of Lawrence Krauss's Wikipedia article, showing a section called "Controversies" with subheadings "Relationship with Jeffrey Epstein" followed by "Allegations of sexual misconduct". Text at https://en.wikipedia.org/wiki/Lawrence_Krauss#Controversies

load more comments (2 replies)
[-] V0ldek@awful.systems 14 points 2 weeks ago

All of those people, Krauss, Dawkins, Harris (okay, that one might've been unsalvageable from the start, I'm really not sure), are such a great reminder that you can be as smart and educated as you want: the moment you believe you're the smartest boi and stop learning and critically approaching your own output, you get sucked into the black hole of your asshole, never to return.

Like if I had a nickel. It's hubris every time. All of those people need just a single good friend who, from time to time, would tell them "man, what you said was really fucking stupid just now" and they'd be saved.

Clout is a proxy of power and power just absolutely rots your fucking brain. Every time a Guy emerges, becomes popular, clearly thinks "haha, but I am different, power will not rot MY brain", five years later boom, he's drinking with Jordan Benzo Peterson. Even Joe Fucking Rogan used to be significantly more lucid before someone gave him ten bazillion dollars for a podcast and he suffered severe clout poisoning.

[-] swlabr@awful.systems 13 points 2 weeks ago

OK. So, this next thing is pretty much completely out of the sneerosphere, but it pattern matches to what we’re used to looking at: a self-styled “science communicator” mansplaining a topic they only have a reductive understanding of: Hank Green gets called out for bad knitting video

archive

[-] V0ldek@awful.systems 11 points 2 weeks ago

TIL Hank Green, the milquetoast BlueSky poster, also has some YouTube channel. How quaint.

I think every time I learn That Guy From BlueSky also has some other gig different from posting silly memes I lose some respect for them.

E.g. I thought Mark Cuban was just a dumb libertarian shitposter, but then it turned out he has a cuntillion dollars and also participated in a show unironically called "Shark Tank" that I still don't 100% believe was a real thing because by god

load more comments (3 replies)
load more comments (3 replies)
[-] Soyweiser@awful.systems 13 points 1 week ago

Angela Collier: Dyson spheres are a joke.

Spoiler: Turns out Dyson agreed.

load more comments (1 replies)
[-] Architeuthis@awful.systems 13 points 1 week ago

Sabine Hossenfelder claims she finally got cancelled, kind of - the Munich Center for Mathematical Philosophy cut ties with her.

Supposedly the MCMP thought publicly shitting on a paper for clicks on your very popular youtube channel was antideontological. Link goes to a reddit post in case you don't want to give her the views.

[-] CinnasVerses@awful.systems 11 points 1 week ago* (last edited 1 week ago)

The commentator who thinks that USD 120k / year is a poor income for someone with a PhD makes me sad. That is what you earn if you become a professor of physics at a research university or get a good postdoc, but she aged out of all of those jobs and was stuck on poorly paid short-term contracts. There are lots of well-paid things that someone with a PhD in physics can do if she is willing to network and work for it, but she chose "rogue intellectual."

A German term to look up is WissZeitVG, but many academic jobs in many countries are only offered to people no more than x years after receiving their PhD (yep, this discriminates against women and the disabled and those with sick spouses or parents).

load more comments (3 replies)
[-] rook@awful.systems 12 points 2 weeks ago

Woke up to some hashtag spam this morning

AI’s Biggest Security Threat May Be Quantum Decryption

which appears to be one of those evolutionary “transitional forms” between grifts.

The sad thing is the underlying point is almost sound (hoarding data puts you at risk of data breaches, and leaking sensitive data might be Very Bad Indeed) but it is wrapped up in so much overhyped nonsense it is barely visible. Naturally, the best and most obvious fix — don’t hoard all that shit in the first place — wasn’t suggested.

(it also appears to be a month-old story, but I guess there’s no reason for mastodon hashtag spammers to be current 🫤)

load more comments (11 replies)
[-] PMMeYourJerkyRecipes@awful.systems 12 points 2 weeks ago

Getting pretty far afield here, but goddamn Matt Yglesias's new magazine sucks:

The case for affirmative action for conservatives

"If we cave in and give the right exactly what they want on this issue, they'll finally be nice to us! Sure, you might think based on the last 50,000 times we've tried this strategy that they'll just move the goalposts and demand further concessions, but then they'll totally look like hypocrites and we'll win the moral victory, which is what actually matters!"

[-] jonhendry@iosdev.space 10 points 2 weeks ago

@PMMeYourJerkyRecipes @BlueMonday1984

The guy from the Federalist *doesn’t* want more ideological diversity in academia, he wants *less*. But he’ll settle for more as an interim goal until he can purge the wrong-thinkers.

load more comments (1 replies)
[-] swlabr@awful.systems 12 points 1 week ago

If Yud and Soares are on a book tour I want them to go on Hot Ones

load more comments (10 replies)
[-] corbin@awful.systems 11 points 1 week ago

Some of our younger readers might not be fully inoculated against high-control language. Fortunately, cult analyst Amanda Montell is on Crash Course this week with a 45min lecture introducing the dynamics of cult linguistics. For example, describing Synanon attack therapy, Youtube comments, doomscrolling, and maybe a familiar watering hole or two:

You know when people can't stop posting negative or conspiratorial comments, thinking they're calling someone out for some moral infraction, when really they're just aiming for clout and maybe catharsis?

load more comments (1 replies)
[-] corbin@awful.systems 10 points 2 weeks ago

Wolfram has a blog post about lambda calculus. As usual, there are no citations and the bibliography is for the wrong blog post and missing many important foundational papers. There are no new results in this blog post (and IMO barely anything interesting) and it's mostly accurate, so it's okay to share the pretty pictures with friends as long as the reader keeps in mind that the author is writing to glorify themselves and make drawings rather than to communicate the essential facts or conduct peer review. I will award partial credit for citing John Tromp's effort in defining these diagrams, although Wolfram ignores that Tromp and an entire community of online enthusiasts have been studying them for decades. But yeah, it's a Mathematica ad.

In which I am pedantic about computer science (but also where I'm putting most of my sneers too, including a punchline)

For example, Wolfram's wrong that every closed lambda term corresponds to a combinator; it's a reasonable assumption that turns out to not make sense upon closer inspection. It's okay, because I know that he was just quoting the same 1992 paper by Fokker that I cited when writing the esolangs page for closed lambda terms, which has the same incorrect claim verbatim as its first sentence. Also, credit to Wolfram for listing Fokker in the bibliography; this is one of the foundational papers that we'd expect to see. With that in mind, here's some differences between my article and his.

The name "Fokker" appears over a dozen times in my article and nowhere in Wolfram's article. Also, I love being citogenic and my article is the origin of the phrase "Fokker size". I think that this is a big miss on his part because he can't envision a future where somebody says something like "The Fokker metric space" or "enriched over Fokker size". I've already written "some closed lambda terms with small Fokker size" in the public domain and it's only a matter of time until Zipf's law wears it down to "some small Fokkers".

Also, while "Tromp" only appears once in my article, it appears next to somebody known only as "mtve" when they collaborated to produce what Wolfram calls a "size-7 lambda" known as Alpha. I love little results like these which aren't formally published and only exist on community wikis. Would have been pretty fascinating if Alpha were complete, wouldn't it Steve!? Would have merited a mention of progress in the community amongst small lambda terms, huh Steve!?

I also checked the BB Gauge for Binary Lambda Calculus (BLC), since it's one of the topics I already wrote up, and found that Wolfram's completely omitted Felgenhauer from the picture too, with that name in neither the text nor bibliography. Felgenhauer's made about as many constructions in BLC as Tromp; Felgenhauer 2014 constructs that Goodstein sequence, for example. Also, Wolfram didn't write that sequence, they sourced it from a living paper not in the bibliography, written by…Felgenhauer! So it's yet another case of Wolfram just handily choosing to omit a name from a decade-old result in the hopes that somebody will prefer his new presentation to the old one.
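
For anyone who hasn't met these size measures before, here's a minimal sketch of what they look like. It uses Tromp's binary lambda calculus bit count (abstraction and application cost 2 bits each, a de Bruijn variable with index n costs n + 1 bits) rather than Fokker's 1992 measure, so treat it purely as an illustration of "size of a closed lambda term", not as anything taken from Wolfram's post or the articles above:

```python
# Sketch of the Tromp binary lambda calculus (BLC) size of de Bruijn terms:
#   abstraction  -> "00" + body        (2 bits plus the body)
#   application  -> "01" + fn + arg    (2 bits plus both subterms)
#   variable n   -> "1"*n + "0"        (n + 1 bits, 1-based de Bruijn index)
# NOTE: this is not the "Fokker size" discussed above; it's a related measure
# used here only to show what a size metric on lambda terms means.

from dataclasses import dataclass

@dataclass
class Var:
    index: int        # 1-based de Bruijn index

@dataclass
class Lam:
    body: "Term"

@dataclass
class App:
    fn: "Term"
    arg: "Term"

Term = Var | Lam | App

def blc_size(t: Term) -> int:
    """Bit length of a term under the BLC encoding."""
    match t:
        case Var(index=n):
            return n + 1
        case Lam(body=b):
            return 2 + blc_size(b)
        case App(fn=f, arg=a):
            return 2 + blc_size(f) + blc_size(a)

# The identity λ1 is 4 bits; K = λλ2 is 7 bits.
print(blc_size(Lam(Var(1))), blc_size(Lam(Lam(Var(2)))))
```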

Finally, what's the point of all this? I think Wolfram writes these posts to advertise Mathematica (which is actually called Wolfram Mathematica and uses a programming language called Wolfram BuT DiD YoU KnOw). He also promotes his attempt at rewriting all of physics to have his logo upon it, and this blog post is a gateway to that project in the sense that Wolfram genuinely believes that staring at these chaotic geometries will reveal the equations of divine nature. Meanwhile I wrote my article in order to ~~win an IRC argument against~~ make a reasonable presentation of an interesting phenomenon in computer science directly to Felgenhauer & Tromp, and while they don't fully agree with me, we together can't disagree with what's presented in the article. That's peer review, right?

load more comments (1 replies)
[-] BlueMonday1984@awful.systems 9 points 1 week ago

The billionaires' dreams of defeating death with technology have been "realised" by Marvel, which is planning an AI-Powered^tm^ hologram of Stan Lee at L.A. Comic Con.

To the shock of nobody, this act of exploitation through digital necromancy is being met with unfiltered disgust.

load more comments (1 replies)
[-] BlueMonday1984@awful.systems 9 points 2 weeks ago

New post from tante: The “Data” Narrative eats itself, using the latest Pivot to AI as a jumping off point to talk about synthetic data.

[-] rook@awful.systems 9 points 1 week ago* (last edited 1 week ago)

Haven’t read the source paper yet (apparently it came out two weeks ago, maybe it already got sneered?) but this might be fun: OpenAI admits AI hallucinations are mathematically inevitable, not just engineering flaws.

Full of little gems like

Beyond proving hallucinations were inevitable, the OpenAI research revealed that industry evaluation methods actively encouraged the problem. Analysis of popular benchmarks, including GPQA, MMLU-Pro, and SWE-bench, found nine out of 10 major evaluations used binary grading that penalized “I don’t know” responses while rewarding incorrect but confident answers.

I had assumed that the problem was solely technical, that the fundamental design of LLMs meant that they’d always generate bullshit, but it hadn’t occurred to me that the developers actively selected for bullshit generation.

It seems kinda obvious in retrospect… slick bullshit extrusion is very much what is selling “AI” to upper management.
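
The incentive is easy to see with a back-of-the-envelope sketch (not from the paper, just illustrative arithmetic): under binary grading a confident guess never scores worse than "I don't know", no matter how unreliable the model, while any rubric that docks points for wrong answers makes abstaining the better move below some accuracy threshold.

```python
# Illustrative arithmetic (not from the OpenAI paper): expected scores for a
# model that is correct with probability p whenever it guesses.

def binary_grade(p: float) -> float:
    """Binary grading: 1 for correct, 0 for wrong or "I don't know".
    Guessing scores p >= 0, so it never does worse than abstaining (0)."""
    return p * 1.0 + (1 - p) * 0.0

def penalised_grade(p: float, wrong_penalty: float = 1.0) -> float:
    """Grading that docks points for confident wrong answers.
    Abstaining (0) wins whenever p < wrong_penalty / (1 + wrong_penalty)."""
    return p * 1.0 - (1 - p) * wrong_penalty

for p in (0.1, 0.3, 0.5, 0.9):
    print(f"p={p:.1f}  binary: guess={binary_grade(p):.2f} vs abstain=0.00 | "
          f"penalised: guess={penalised_grade(p):+.2f} vs abstain=0.00")
```

So on a binary leaderboard, tuning a model toward confident guesses is the rational move, which is exactly the selection pressure described above.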

[-] dgerard@awful.systems 2 points 3 days ago

it's a shitty paper but even they couldn't avoid the point forever

load more comments (1 replies)
[-] BlueMonday1984@awful.systems 9 points 2 weeks ago* (last edited 2 weeks ago)

New edition of AI Killed My Job, giving a deep dive into how genAI has hurt artists. I'd like to bring particular attention to Melissa's story, which is roughly halfway through, specifically the ending:

There's a part of me that will never forgive the tech industry for what they've taken from me and what they've chosen to do with it. In the early days as the dawning horror set in, I cried about this almost every day. I wondered if I should quit making art. I contemplated suicide. I did nothing to these people, but every day I have to see them gleefully cheer online about the anticipated death of my chosen profession. I had no idea we artists were so hated—I still don't know why. What did my silly little cat drawings do to earn so much contempt? That part is probably one of the hardest consequences of AI to come to terms with. It didn't just try to take my job (or succeed in making my job worse), it exposed a whole lot of people who hate me and everything I am for reasons I can't fathom. They want to exploit me and see me eradicated at the same time.
