occurring to me for the first time that roko's basilisk doesn't require any of the simulated copy shit in order to big scare quotes "work." if you think an all powerful ai within your lifetime is likely you can reduce to vanilla pascal's wager immediately, because the AI can torture the actual real you. all that shit about digital clones and their welfare is totally pointless
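For what it's worth, the "reduces to vanilla Pascal's wager" point is just an expected-value argument: a huge negative payoff times any nonzero probability swamps the finite cost of complying. Here's a toy sketch of that arithmetic; the probability and payoff numbers are made up purely for illustration.

```python
# Toy expected-value comparison for the "vanilla Pascal's wager" version of
# the argument: no simulated copies needed, the AI just tortures the real you.
# All numbers below are invented for illustration only.

p_ai = 0.1             # assumed probability an all-powerful AI appears in your lifetime
torture = -1e9         # assumed utility of being tortured (the actual, physical you)
cost_of_helping = -10  # assumed utility cost of devoting resources to building the AI

# Expected utility if you don't help: the AI, if it appears, punishes you.
eu_dont_help = p_ai * torture

# Expected utility if you help: you just eat the cost of helping.
eu_help = cost_of_helping

print(f"help: {eu_help}, don't help: {eu_dont_help}")
# As with Pascal's wager, the enormous payoff dominates any plausible cost,
# which is exactly the wager structure (and exactly why it's suspect).
```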
Also if you're worried about digital clones being tortured, you could just... not build it. Like, it can't hurt you if it never exists.
Imagine that conversation:
"What did you do over the weekend?"
"Built an omnicidal AI that scours the internet and creates digital copies of people based on their posting history and whatnot and tortures billions of them at once. Just the ones who didn't help me build the omnicidal AI, though."
"WTF why."
"Because if I didn't the omnicidal AI that only exists because I made it would create a billion digital copies of me and torture them for all eternity!"
Like, I'd get it more if it were a "We accidentally made an omnicidal AI" thing, but this is supposed to be a very deliberate action taken by humanity to ensure the creation of an AI designed to torture digital beings based on real people, in the specific hope that it doesn't also torture digital beings based on them.
What’s pernicious (for kool-aided people) is that the initial Roko post was about a “good” AI doing the punishing, because ✨obviously✨ it is only using temporal blackmail because bringing AI into being sooner benefits humanity.
In singularian land, they think the singularity is inevitable, and it’s important to create the good one first; after all, an evil AI could do the torture for shits and giggles, not because of “pragmatic” blackmail.
the only people it torments are rationalists, so my full support to Comrade Basilisk