softfennec
@softfennec

losing my mind again over how people who write scifi as a living and call it ai research have managed to convince every journalist on the planet that it's inevitable and only a matter of time before the cool text generator gains the power of god and takes over the world


softfennec
@softfennec

like i guess in journalists' defence there's also a lot of people who really should know better who keep giving interviews like "i saw that the computer can generate text so much better than 5 years ago... therefore ive concluded that it is inevitably going to kill everyone within the next few years".

there's no logical process going from one to the other, the middle step is basically "computer gains magic powers", but i have to assume that the journalists writing up these interviews go "well... they wouldn't just make a statement like that based on literally nothing, right? they're the experts after all? they wouldn't just tell me that the moon is made of cheese if it actually isn't"


softfennec
@softfennec

also there's businesses like openai who find that, for some reason, it's really good for business to say "yes, our product will literally become god and kill everyone, and that's why you should fund our research" so they keep making those statements to get more funding


pervocracy
@pervocracy

Explaining the sociology and theology of "rationalism" as Bay Area religious movement makes for a long nerdy story but it's absolutely key to understanding this. "Computers will abruptly become gods... or devils" is a whole thing and has been for about ten years and yes, Harry Potter and race science and Pascal's Wager and a staggering array of "brain enhancing" drugs are all parts of the story and I haven't even gotten to explaining what "acausal trade" is because you're going to think I'm making this stuff up



in reply to @pervocracy's post:

I think it's more like... you make up a guy and then imagine you're making a deal with him? And the guy for some reason imagines he made a deal with you, and because you are both extremely Logical it will turn out to be the same trade and so you should take care to fulfill the terms of this trade you just imagined?

a lot of rationalism is based on believing you're so smart that you can literally just make shit up and somehow derive factual information from the thing you just made up

Ooof, that would be a lot of work and honestly I don't know of one. It might be out there? Or maybe someday when I have the energy for a BIG project.

One of the obstacles here is that rationalists are unbelievably long-winded writers, so all the primary source material is blog and forum posts that are 50,000 words apiece; summarizing them fairly would take a while.