
SamKeeper
@SamKeeper

for an awful lot of reasons, the notion of the "Paperclip Optimizer" has a lot of purchase right now. it's the precursor to what eventually might be "grey goo" or per the Culture novels a "hegemonizing swarm", a dumb system designed to do nothing but expand its capacity to convert everything into a reflection of its initial programming, i.e., turn all the matter in the universe into paperclips.

there was even a web game about it!

this article I wrote a couple years ago is about that game. I think it's worth reposting now cause people keep talking about the paperclip optimizer as a parable about dangerous dumb systems. and that's true, that's what the game is about! the game is very much a deliberate allegory meant to explain why you should support "friendly AI" grifters!

what this article proposes is: maybe you shouldn't do that, actually, because behind every "rogue AI" is some capitalist somewhere making a decision to make All The Money, damn the consequences. this is an article about playing Universal Paperclips radically wrong--both radically wrong mechanically, and radically wrong emotionally. what I think falls out when you shake the game that way is a lot of unstated assumptions about shit that's acceptable for human beings to inflict on each other but somehow monstrous when a machine is doing it.

like, I get that we're all attempting to be more materialist in our analysis and that's good, but sometimes it feels like we're sliding into a kind of Lovecraftian understanding of the corporation, like it's just this incomprehensible machine working for itself. but at every stage there's people making decisions and they COULD be held accountable! and also, there's a designer of this game making decisions about where to put content emphasis, in order to put a finger on the scales of the parable. you don't HAVE to inflict mind control drones on humanity in the game any more than people HAVE to use deceptive advertising practices.

and by the same token like, it's actually perfectly reasonable for someone who isn't in STEM to look at a search engine spitting out wrong results and say hey, this search engine is bad! you can say "ah but technically machine learning is not intended to output correct results, you've made a Category Error" all you want; a human being sold this to other human beings as an intelligent search engine, and that sale was based on a whole series of lies. the technical explanation can be helpful, but it's not the point. the point is that a human attempted to harm other human beings with technology, something we've been doing roughly since the opening sequence of 2001: A Space Odyssey.

anyway there's a lot of weird maybe kinda heterodox perspectives in this article that I still haven't really seen anywhere else but that still really guide a lot of my thinking about this tech. read it if it sounds interesting I guess!


arborelia
@arborelia

I've been wary about engaging with the game "Universal Paperclips" because the story of the paperclip maximizer comes originally from Nick Bostrom. The eugenics guy who is a thought leader in "effective altruism" and particularly in the "X-risk" doomsday cult part of it. The guy who was recently revealed to be very racist on top of his efforts to put a positive spin on eugenics.

It wasn't something I could speak authoritatively about, because, y'know, I didn't play the game. I've seen commentated speedruns of it. It at least seems to have more interesting sci-fi ideas in it than any sci-fi that Bostrom ever presented as fact.

And I wondered if, by reformulating the concept as a clicker game, it was perhaps subverting it, like the way the movie "Starship Troopers" subverted the ideas of the book's author.

For one thing, you could take the "paperclip maximizer" as a cautionary parable about capitalism! We have unfriendly artificial entities that operate according to weird rules right now, and we grant personhood to them; they're corporations. They are willing to destroy things that humans hold dear in pursuit of their singular goal of being "money maximizers".

But the true believers in "X-risk" don't take it that way. To them, it's not a cautionary metaphor for capitalism, but a literal thing that we need to prevent.[1] To them, the most important thing in the world is for them to do as much capitalism as possible so they can win the race to build a "friendly AI" first. They back this belief up with other writings by the racist eugenicist Nick Bostrom.

So does the game have something more to say than the cult mindset that created its source material?

It sounds like the short answer is no, not really. But you really should read the article. I'm going to link it again even though it's the big embed at the top, just to be sure.


  [1] I don't mean to claim that they literally believe it will be paperclips. But they do literally believe in a near-future all-powerful AI god.

