18

Hello everybody, after a lengthy delay, my talk at the University of Sydney about Neoreaction, and the ways I tried to map its various communities, is now available.

Please ignore the coughing. My keynote slides were very dusty.

31

I want to make the case to my employer that we should drop Twitter/X as a promotional channel. I could start drawing together the various examples of disinfo spreading, the reinstating of CSAM posters, the rise of content inciting violence, etc., but I thought I'd check whether someone hasn't already been tracking this. The sooner I can pull the info together the better, but I don't have time right now to compile it myself.

Anyone know if there's a site, wiki, resource, thread etc that could set me up?

[-] UnseriousAcademic@awful.systems 18 points 2 months ago* (last edited 2 months ago)

Good to see some reporting that continues to gloss over Srinivasan's obsession with culture-war "wokeness" and writing that veers a little too close to the idea of purging undesirables. Also missing my favourite: the open musing that the next best step might be to corrupt the police force through financial incentives, so they can have it as their own private security force to take over San Francisco.

[-] UnseriousAcademic@awful.systems 35 points 2 months ago

Man, I feel this, particularly the sudden shutting down of data access because all the platforms want OpenAI money. I spent three years building a tool that pulled follower relation data from Twitter and exponentially crawled its way outwards from a few seed accounts to millions of users. Using that data it was able to build a compressed summary network, identify community structures, name the communities based on words in user profiles, and then use sampled tweet data to tell us the extent to which different communities interacted.
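That pipeline can be sketched roughly like this. This is a hypothetical, stdlib-only toy, not the original tool: union-find connected components stand in for proper community detection, and every username and bio is invented.

```python
# Toy sketch of a community-mapping pipeline: group users in a follower
# graph, then name each group by the most frequent words in members' bios.
from collections import Counter, defaultdict


def find_components(edges):
    """Union-find over an undirected follower graph; returns node sets."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for a, b in edges:
        union(a, b)
    groups = defaultdict(set)
    for node in parent:
        groups[find(node)].add(node)
    return list(groups.values())


def label_communities(edges, bios, top_words=2):
    """Return (members, label_words) pairs, one per detected community."""
    labelled = []
    for members in find_components(edges):
        words = Counter(
            w.lower() for u in members for w in bios.get(u, "").split()
        )
        labelled.append((members, [w for w, _ in words.most_common(top_words)]))
    return labelled
```

On real data you'd swap the connected-components step for a proper community-detection algorithm (e.g. modularity-based methods) and weight the bio words by distinctiveness rather than raw frequency, but the overall shape — graph, partition, label from profiles — is the same.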

I spent eight months in ethics committees getting approval to do it and got a prototype working, but rather than just publish I wanted to make it accessible to the academic community, so I spent even more time building an interface, making it user friendly, improving performance, making it more stable, etc.

I wanted to ensure that when we published our results I could also say "here is this method we've developed, and here you can test it and use it too for free, even if you don't know how to code". Some people at my institution wanted me to explore commercialising but I always intended to go open source. I'm not a professional developer by any means so the project was always going to be a janky academic thing, but it worked for our purposes and was a new way of working with social media data to ask questions that couldn't be answered before.

Then the API got put behind a $48K a month paywall and the project was dead. Then everywhere else started shutting their doors too. I don't do social media research anymore.

22

The benefits of crypto are self-evident, thus it is necessary to build an elaborate faux education system to demonstrate them.

I'm sure there will also be some Network Fascism in there for good measure.

[-] UnseriousAcademic@awful.systems 15 points 2 months ago

Promptfondler sounds like an Aphex Twin song title.

38

Revered friends. I wrote a thing. Mainly because I had a stack of stuff on Joseph Weizenbaum on tap and the AI classroom thing was stuck in my head. I don't know if it's good, but it's certainly written.

[-] UnseriousAcademic@awful.systems 33 points 2 months ago

The learning facilitators they mention are the key to understanding all of this. They still need humans in the room to maintain discipline and ensure the kids engage with the AI. But roles that were once teaching positions have been redefined as "learning facilitators", and apparently former teachers have rejoined the school in these new roles.

Like a lot of automation, the main selling point is deskilling roles, reducing pay, and making people more easily replaceable (you don't need a teaching qualification to be a "learning facilitator" to the AI), while producing a worse service that is just good enough when wrapped in difficult-to-verify claims and assumptions about what education actually is. Of course, it also means a new middleman parasite siphoning off funds that used to flow to staff.

53
[-] UnseriousAcademic@awful.systems 14 points 2 months ago

Does this mean they're not going to bother training a whole new model again? I was looking forward to seeing AI Mad Cow Disease after it consumed an Internet's worth of AI generated content.

[-] UnseriousAcademic@awful.systems 24 points 3 months ago

My most charitable interpretation of this is that he, like a lot of people, doesn't understand AI in the slightest. He treated it like Google, asked for some of the most negative quotes from movie critics about past Coppola films, and the AI hallucinated some for him.

If true, it's a great example of why AI is actually worse for information retrieval than a basic vector-based search engine.
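To illustrate the contrast, here's a toy vector-space retriever (TF-IDF with cosine similarity, stdlib only; all documents and queries are invented). Unlike a generative model, it can only ever return text that actually exists in the corpus, so it cannot hallucinate a quote:

```python
# Minimal TF-IDF retrieval: rank documents by cosine similarity to a query.
import math
from collections import Counter


def tfidf_search(docs, query):
    """Return docs ranked by TF-IDF cosine similarity to the query."""
    tokenized = [d.lower().split() for d in docs]
    n = len(docs)
    # Inverse document frequency for every term in the corpus.
    idf = {
        t: math.log(n / sum(1 for d in tokenized if t in d))
        for d in tokenized for t in d
    }

    def vec(tokens):
        tf = Counter(tokens)
        return {t: tf[t] * idf.get(t, 0.0) for t in tf}

    def cosine(a, b):
        dot = sum(a[t] * b.get(t, 0.0) for t in a)
        norm = (math.sqrt(sum(v * v for v in a.values()))
                * math.sqrt(sum(v * v for v in b.values())))
        return dot / norm if norm else 0.0

    q = vec(query.lower().split())
    scored = [(cosine(vec(d), q), doc) for d, doc in zip(tokenized, docs)]
    return [doc for _, doc in sorted(scored, reverse=True)]
```

A query with no match in the corpus simply scores zero everywhere; the system can say "nothing found" instead of inventing a plausible-sounding answer.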

19
The Politics of Urbit (journals.sagepub.com)

With Yarvin renewing interest in Urbit I was reminded of this paper that focuses on Urbit as a representation of the politics of "exit". It's free/open access if anyone is interested.

From the abstract...

"This paper examines the impact of neoreactionary (NRx) thinking – that of Curtis Yarvin, Nick Land, Peter Thiel and Patri Friedman in particular – on contemporary political debates manifest in 'architectures of exit'... While technological programmes such as Urbit may never ultimately succeed, we argue that these, and other speculative investments such as 'seasteading', reflect broader post-neoliberal NRx imaginaries that were, perhaps, prefigured a quarter of a century ago in The Sovereign Individual."

23

Hello all. People were very kind when I originally posted the start of this series. I've refrained from spamming you with every part but I thought I'd post to say the very final installment is done.

I got a bit weird with it this time, as I felt like I had an infinite amount to say, all of which only barely got at the underlying point I was trying to make. It's ridiculous how much of what I wrote I also cut.

Anyway, now the series is done I'm going to move on to smaller, discrete pieces as I work on my book about tech culture's propensity for far-right politics. I'll be dropping interesting stuff I find, examples of Right Libertarians saying ridiculous things, so follow along if that's your jam.

[-] UnseriousAcademic@awful.systems 37 points 3 months ago

I feel like generative AI is an indicator of a broader pattern of innovation in stagnation (shower thoughts here, I'm not bringing sources to this game).

A little while ago I was wondering whether there's an argument to be made that the innovations of the post-war period were far more radically and beneficially transformative for most people: accessible dishwashers, home tools, better home refrigeration, etc. I feel like now tech is just here to make things worse. I can't think of any upcoming or recent home tech product that I'm remotely excited about.

[-] UnseriousAcademic@awful.systems 22 points 3 months ago

I'm banking on the primary use case being "getting Elon sued into oblivion by Disney".

[-] UnseriousAcademic@awful.systems 14 points 3 months ago

Fascinating to see that the politics of the old crypto hype train have carried over to the new hype train.

163

The cost of simply retrieving an answer from the Web is infinitely smaller than the cost of generating a new one.

Great interview with Sasha Luccioni from Huggingface on all the ways that using generative AI for everything is both a) hugely costly compared to existing methods, and b) insane.

[-] UnseriousAcademic@awful.systems 54 points 4 months ago

If this includes their journals then I guess my stuff is off to the big LLM melting pot to be regurgitated wrongly without context or attribution.

[-] UnseriousAcademic@awful.systems 16 points 4 months ago

Absolutely. In fact, in one major survey of the values of the counterculture, conducted back in the 1960s, Ayn Rand was listed as one of people's major influences. There were different strands to the counterculture: one communitarian, the other about self-actualisation and the individual. Both positioned themselves in opposition to the state, but they differed significantly in what kind of future they wanted.

[-] UnseriousAcademic@awful.systems 30 points 4 months ago* (last edited 4 months ago)

John Ganz gave good coverage of the ideological side of tech, particularly drawing on Herf's book Reactionary Modernism, which looks at the role of engineers in building Nazi ideology.

You can read Reactionary Modernism for free on the Internet Archive.

175

Seeing a sudden surge in interest in the "Tech Right", as they're being dubbed. Often the focus is on business motivations like tax breaks, but I think there's more to it. The narrative that Silicon Valley is a bunch of tech hippies was well sown early on, particularly by Stewart Brand and his ilk, but throughout that period and before it, the intersection between tech and authoritarian politics that favours systems over people is well established.

34
Devs and the Culture of Tech (unserious.substack.com)

Hello all,

TLDR: I've written some stuff about tech ideology via the TV show Devs. It's all free, no paid subs etc. Would love it if anyone interested wanted to take a look - link is to my blog.

Longer blurb: Firstly if this is severely poor form please tell me to do one, throw tomatoes etc.

I'm a sociologist who focuses on tech culture, particularly elite tech culture and the far right. I started off writing about the piracy cultures of the 2000s and their role in the switch to digital distribution, back in 2013. Just by virtue of paying attention to tech ideology I've now ended up also researching far-right extremism and radicalisation, and I do a lot of data analysis with antifascist orgs. I also used to flirt around in the Sneerclub post-rat spaces on Reddit and Twitter a few years back too.

Anyway, I've been researching NRx and the wider fashy nature of tech since 2016, but because of "issues" I've not yet got much out into the world. I'm working on a book that more closely examines the way the history and ideologies of tech culture play well to far-right extremism, and what that might say about the process of radicalisation more generally.

However, because I'm tired of glacial academic publishing timelines, I've also started a research blog called Unserious Academic, and for my first project I use the Alex Garland TV show Devs to illustrate and explore some of the things I know about tech culture. I've put out three parts so far, with a fourth ready for Monday. I'm not looking for paid subs or anything; it's all free, I just figured some people might be interested.

I also desperately need a place where people know what a neoreactionary is so I can more easily complain about them so I'd like to hang around longer term too. Thanks for your time!


UnseriousAcademic

joined 4 months ago