
mtrc
@mtrc

When we listen to people talk about the benefits of a new technology, they often talk in ways that sound quite utopian. For example, if you go onto the website of GitHub's Copilot, a ChatGPT-like tool specialised in writing code, you'll run into a lot of very exciting-sounding blurb that explains its many benefits:

I have felt improvements of 50%, the process of getting started is very simple.
Sebastian Barrios // VP of Technology


mrhands
@mrhands

Luddites were skilled laborers whose jobs were "made redundant" with the introduction of weaving machines. Yes, the new jobs were better, but you only need a handful of weavers to manage dozens of machines. And the Industrial Revolution meant that everyone needed a job to survive because people could no longer live off the land as they had in centuries past. The natural response then was anger against the machines threatening their livelihood.

AI for code generation is currently dogshit but improving exponentially. Pretty soon, the programming jobs left will be for senior programmers like me. And I will be expected to "tweak the output" of these algorithms until the people in charge can get rid of me too.

I hope the owning class realizes soon that Universal Basic Income is preferable to people realizing they have nothing left to lose anyway.



in reply to @mtrc's post:

Just to draw out something you mention but leave a little implicit - the scenario you're describing is explicitly a trade: 2 fewer developers, but an expensive subscription fee instead. A shift from spending on labour to paying another company for... well, partly their labour, but mainly rent on their IP (the models) and running some data centers.

And it's also worth thinking about who has the better negotiating position when it comes to haggling over the Copilot subscription fee - the small shop with 10 devs, or a company with tens of thousands. Software development, unusually, tends to inversely scale with size, because co-ordination costs are such a big thing - but here's a way to neutralise that effect a little.

i'm actually also going to link a post i made here, because this one definitely makes me think of it. on the way that the invention of PowerPoint displaced a lot of labour, especially of those "boring" graphic design jobs you mention. https://cohost.org/v21/post/3117267-a-few-things-on-this

I'm just so tired of, what, 50 years of "oh this will increase productivity!" Now productivity is even higher than before, and wages stay flat. Taking these companies at their word is just another one of those; you don't even need to look at the list of jobs thing.

I think that's what bugs me most about tech evangelists: they're trying to sell me on something that's "for the good of everyone", but it's so easy to see how the tangible benefits just flow upwards.

It's very tiring, and I find myself particularly conflicted about it the more I am asked to discuss the future, government policy, research directions and so on. How do we do research that actually benefits people? Sometimes it's clear but mostly it's quite, quite unclear.

I think one of the even more insidious things about the 10-programmer scenario you created, is that it is very likely that some of those programmers will be more senior, and some more junior. What do the juniors usually do in order to learn? Well, generally, the more rote/boring/repetitive tasks that the seniors don't have time for, and that the juniors are still qualified to do. If those tasks are the ones being relegated to AI and everyone is using it, what happens to junior roles? i.e., how does anyone learn how to become a senior developer? (Or get a job in the first place?)

And, as you noted at the end - "interesting" often equals more complexity in the task. It's good for cognitive labourers to have some ebb and flow in that complexity throughout the workday. I wouldn't want to be working on my hardest tasks 24/7, after all.

Yes. I worry about this a lot at my university too. I see so many students using Copilot, ChatGPT and other tools, and while I can see their use in so many cases, I worry... what does the future look like when everyone uses these tools? Maybe it'll be fine, and it'll just become another layer to how we work, like how no-one needs to look up logarithms in books any more. Or... maybe something else? Maybe we'll just be left with a huge gap in knowledge and experience. That really worries me.

I do really appreciate the insight here, but I have to further preface my thoughts with this: I'm a CS grad (not in grad school) trying to figure out what I can do with software development that can help me move forward without contributing toward making the world worse, or even without making a positive impact. There is a lot I don't know, or currently care about, here. Ideally I'd just make games, but that doesn't offer stability, nor does it address my growing discontent with the way things are going in the world (death knell of capitalism, rigidly organized society, work, and thought, lack of human compassion and acceptance, etc.).

So...

Why is it valuable/important to research and work on things like AI game designers? I don't fully hate the concept of AI, even as it stands. I've heard things about being able to run simulations like protein folding or climate modeling, which sound legitimately useful for research that benefits humanity. I know it can also help with accessibility to art, and other information (closed captions, TTS, etc.?). I like that various flavors of autocomplete for writing code (like finishing a simple line, or dropping in brackets and other basic structures) speed up the process of actually making stuff. I just really do not understand trying to optimize art, especially when there's a litany of problems with the perception and implementation of AI, basically everywhere.

Sure, I think it can be interesting, but it's more of a philosophical curiosity to me than a problem that should, or even can, be solved. Procgen is neat, I'd like to get to the point where I can mess with it. I'd also be intrigued by things that could incorporate data sets (art, text) made by the creators of the software used to output things that are like art, or even be considered art themselves. As far as expanding the range and possibilities of individual/team expression, I think there's things worthy of discussion there. I still don't see why broad research on this is currently worthwhile, instead of making AI into something that can do beneficial things well, because specialized models to do very specific tasks can be used to great effect, I think(?).

I'm sorry, I know you're deep in this field, and have already said a lot, and done a lot of work, that pushes against the kinds of things my words here have implied, but I just feel like I Do Not Get It.

Hey Fungal! This is a great question and there's no need to apologise for it - in fact I've had to answer it a few times in the past, and the most brutal questioning I got was from school students! I've been doing this for a long time, long before the AI boom started, and my philosophy was always to make small systems that worked alongside people and engaged with creative communities.

Some of my (current) thinking about why this is useful can be found here: https://www.youtube.com/watch?v=bhAj_NfqJQw. It explicitly talks about the need to push back against the politics of modern AI (I don't actually use machine learning, datasets, or optimise for creativity). You can also find a longer, more documentary-like video I made on automated game design here: https://www.youtube.com/watch?v=dZv-vRrnHDA which explicitly critiques the idea that AI can save us time or money.

However, with your permission I'd like to write a longer-form answer to this as a post sometime, so I can break down how I currently think about this task. The answer depends on who's asking, but I've thought long and hard about how to do my research in a way that enriches designers' lives, acts as an extension of my creative practice, and makes the world more harmlessly joyful.

Thank you for the kind and thoughtful response. I don't usually come at this subject with an earnest curiosity and respect for the ideas and work involved. Your apparent awareness of the "hate" and understanding of it, while meaningfully contributing your own thoughts to whatever this discourse is, helped me trust that I wouldn't be skewered if I expressed my grievances, even in the midst of feeling like I wanted to burn everything down.

I do like some things from your response and your talk on Puck that you linked. The parts about rejecting the idea of linear progress and predestination(?) were pretty resonant with my own feelings. I think I can kind of appreciate Puck as a research tool, or maybe even toy, as opposed to a "designer" or generator. I'm still fuzzy on what/if there is an overarching purpose to Puck, as I don't imagine AI being able to output something close to the level of "polish" and "quality" of a work created by human designers, artists, developers, etc. Maybe it's my doomerism getting in the way of me better understanding this.

I also do not really understand how any of this works and tend to feel like that knowledge gap is trivial compared to the monolith of corporate AI dominating tech and many notions of what art is/should be (or labor and society, for that matter). Still, the idea of AI without machine learning or datasets is intriguing, even if I know even less about that approach.

I don't want to be afraid of the concept of AI (in the sense of endless sludge and erasure of human potential, not the nanoscopic chance it leads to actual machine sentience), but I hardly ever hear anything about it that makes me feel like it's worthwhile.

I would absolutely be interested in seeing you elaborate on these kinds of concerns and your counterpoints. I don't feel that my specific comment/questions deserve a dedicated post, but I also don't know if that's even what you were implying. That, perhaps further generalized, unease and anger seems like a good topic though. However many voices are involved in said future post, I currently do not wish to be directly credited or referenced.

(I hope that made sense)

Thank you.

Hey again,

This is such a great and fascinating follow-up comment, and I think you've made me realise a few things that had fallen out of my mind in recent years. In 2014, for example, I knew most people I spoke to didn't know what AI was, because no-one did. Over the last decade, though, because it's everywhere, I think my mindset became lazy to the point of assuming that since everyone knows about machine learning, they'd probably heard about the other stuff too. But that's a bad assumption and totally not the case! So yes, this was super helpful.

"I don't imagine AI being able to output something close to the level of "polish" and "quality" of a work created by human designers, artists, developers, etc."

Certainly not any AI that I make! Haha. And one of the big thrusts of my work is that, even if Puck could make AAA videogames every day, like better than the best blockbuster, it still wouldn't matter to us because games (and creativity) are about expressing and communicating and meaning. That's why what I really want to do is make an AI that can be your friend, or your mentor, or your collaborator, or whatever. Tiny, social robots that we understand, control and get support from.

I definitely won't credit you in the future post, as you request, but I do appreciate it; it's given me a really nice idea of something I'd like to write, or maybe turn into a talk.

It's perfectly fine to feel fear, doom or anything else in response to this. I feel the same things on a daily basis. And it's increasingly hard to be in the AI space personally. But - just like all aspects of capitalism, we can find ways to resist its worst excesses together, and figure out a way forward that we want to take. I can't stop any of the doom coming our way, but I can promise to be here and talk to you and work with you on helping build brighter futures inside of the storm.

Thanks again :)

Wow. I didn't expect this kind of dialogue. I'm just a person who knows nothing about the thing I've spent so many hours despising.

I may be misreading your words, but I don't intend to do anything formal about these thoughts and feelings of mine. I don't know where or how to begin with something like that, and I'm not currently convinced I could contribute to anything around this while I feel like myself and so many things outside of me are in shambles.

I also feel like this revulsion has led me to feel uncomfortable with how things are done in tech by individuals because of ingrained conventions(?), or at least my vague notions of things like dependencies upon dependencies, UX, massive webdev frameworks...I feel like I could go pretty deep down this rabbit hole without it really amounting to anything.

I guess on top of that, I do worry about the massive scale and power consumption of corporate AI, but I don't know the specifics there either.

Thank you for earnestly engaging with my vague unease about this. I do hope this protracted interaction was of some use, like you said, though reading that did shock me.

"I don't intend to do anything formal about these thoughts and feelings of mine."

You don't have to do anything grand! We all do things every day - even posting your original comment was you speaking out about this thing you feel strongly about. It's not about grand gestures, but about the small things we do every day that build up communities and trends :)

A lot of the state of things is pretty awful. And yet, people like you and I exist who care and worry about these things, despite being in the middle of the tech industry ourselves! So let that be some comfort at least, and belief that things can change.
