[-] Architeuthis@awful.systems 14 points 3 months ago* (last edited 3 months ago)

This hits differently over the recent news that ChatGPT encouraged and aided a teen suicide.

Kelsey Piper xhitted: Never thought I'd become a 'take your relationship problems to ChatGPT' person but when the 8yo and I have an argument it actually works really well to mutually agree on an account of events for Claude and then ask for its opinion

I think she considers the AIs far more knowledgeable than me about reasonable human behavior so if I say something that's no reason to think it's true but if Claude says it then it at least merits serious consideration

[-] Architeuthis@awful.systems 14 points 3 months ago

When I was at computer toucher school at about the start of the century, what we were taught under the moniker of AI was (I think) fuzzy logic, incremental optimization and graph algorithms, and neural networks.

AI is a sci-fi trope far more than it ever was a well-defined research topic.

[-] Architeuthis@awful.systems 14 points 4 months ago

Eight-year-olds, Dude.

[-] Architeuthis@awful.systems 15 points 4 months ago

I don't think it's even illegal for a minor to watch R-rated movies; it's more of a guideline for your caretakers.

[-] Architeuthis@awful.systems 14 points 5 months ago

Training a model on its own slop supposedly makes it suck more, though. If Microsoft wanted to milk their programmers for quality training data they should probably be banning copilot, not mandating it.

At this point it's an even bet that they are doing this because copilot has groomed the executives into thinking it can't do wrong.

[-] Architeuthis@awful.systems 14 points 6 months ago

Modern academia is a shambling corpse, its husk long hollowed out by the woke mind virus, and scientific consensus is also cringe because it’s mean to me for being an IQ and genetics obsessed weirdo. Therefore you should prioritize alternative takes, preferably by longwinded laymen from the ingroup or maybe contrarian specialists, the more cancelled the bett-- wait, wait, no, not like that!

[-] Architeuthis@awful.systems 14 points 10 months ago* (last edited 10 months ago)

Taylor said the group believes in timeless decision theory, a Rationalist belief suggesting that human decisions and their effects are mathematically quantifiable.

Seems like they gave up early if they don't bring up how it was developed specifically for deals with the (acausal, robotic) devil, and also awfully nice of them to keep Yud's name out of it.

edit: Also in lieu of explanation they link to the wikipedia page on rationalism as a philosophical movement which of course has fuck all to do with the bay area bayes cargo cult, despite it having a small mention there, with most of the Talk: page being about how it really shouldn't.

[-] Architeuthis@awful.systems 14 points 1 year ago

If you never come up with a marketable product you can remain a startup indefinitely.

[-] Architeuthis@awful.systems 14 points 1 year ago* (last edited 1 year ago)

This is conceptually different, it just generates a few seconds of doomlike video that you can slightly influence by sending inputs, and pretends that In The Future™ entire games could be generated from scratch and playable on Sufficiently Advanced™ autocomplete machines.

[-] Architeuthis@awful.systems 14 points 2 years ago

This has got to be some sort of sucker filter; it's not that he particularly means it, it's that he's after the exact type of rube who is unfazed by naked contrarianism and the categorically preposterous so long as it's said with a straight face.

Maybe there's something to the whole pick-up-artistry-but-for-nailing-VCs thing.

[-] Architeuthis@awful.systems 14 points 2 years ago

To the extent EA/rats perpetuate cult behavior, it's probably safe to say that neither EY nor any other high status individuals within the space are wanting for sex.

[-] Architeuthis@awful.systems 14 points 2 years ago

The last Behind the Bastards episode is this article expanded. Robert Evans is always very listenable and the more detailed CES reporting is interesting, but if you are a member here you probably won't be adding anything new to your TREACLES lore.

I wish journalists referencing the basilisk would go in a bit more depth; it's so much dumber than it seems at a brief glance. Like, a lot of people immediately assume the alleged scary part is that we might already be living in the simulation and thus be eligible for permanent residence in basilisk Hell should we commit the cardinal sin of shit-talking AI, but no; the reason you can go to AI hell is transhumanist cope.

As in, if your last hope for immortality is brain uploads, you are kinda cornered into believing your sense of self gets shared between the physical and the digital instance, otherwise what's the point? EY appears to be in this boat; he's claimed something like there being no real difference between instances of You existing at different moments in time sharing a self and you sharing a self with a perfect digital copy, so yeah, it's obviously possible, unavoidable even.

As to how the basilisk will get your digital copy in the first place, eh, it'll just extrapolate it perfectly from whatever impression's left of you in the timeline by the time it comes into being, because as we all know, the S in ASI stands for Fucking Magical, Does Whatever It Wants. Remember, ASI can conjure up the entirety of modern physics just by seeing three frames of an apple falling, according to Yud.

