
bethposting
@bethposting

it relies on a massive, unfounded assumption that AGI will somehow take over running the world within our lifetimes, which anyone who actually knows anything about AI can easily identify as complete bullshit. it's like pascal's wager for really stupid dudes online who are convinced that they're geniuses. anyone spending time worrying about a scenario that's just a figment of a fevered imagination probably has some motivation to distract attention from the thousands of real and urgent problems that exist in the real world where people live


amagire
@amagire

"what if an all-powerful technogod creates a perfect simulation of you to torment for eternity" uh, idk, johann. what if vampires are real



in reply to @bethposting's post:

the guy who thought it up is also now a completely off-the-deep-end white supremacist and (even less surprisingly) AI doomer (of the hype variety, ie "it will be incredibly powerful not just some shitty capitalism amplifier") so it makes sense.

whole thing is a massive pit of intellectual punji sticks.

The only way I can handle hearing about it is by assuming that belief in it is some kind of hoax. The idea that actual real people consider it any more serious than a creepypasta gives me incalculable psychic damage

in reply to @amagire's post:

Here's a simulation, running right here on Cohost's software: "Cliff was being tortured. He did not like it."

Why would the output of a more complex simulation bother me more than that does? Either way it's not a thing that's happening, it's just a thing a computer says is happening.

(Also, why would the evil computer waste its own resources on ultra-high-fidelity simulations of billions of miserable people? The nerds are already convinced it would, so it doesn't gain anything from actually following through.)

Roko's basilisk is a thing I ended up reading way, way too much about because I embarrassingly once thought the lesswrongers were smart, and if they were worried about something so gobsmackingly silly it must be because I was misunderstanding something.

nope! "subjective bayesian probability" and "coherent extrapolated volition": gobsmackingly silly but with more words

"IF advanced A.I. is inevitable AND ALSO capable of complex realistic simulation THEN the possibility that we are living in simulated-reality vs. real-reality is essentially a certainty" like okay Clive Pascal did you notice the part where all of that was bullshit