Newsletters and industry news sites I follow are increasingly using Midjourney or another AI to provide illustrations where they used to commission art or purchase stock photos.

I want to offer some sympathy and validation to all the artists who warned us this would happen and the impact it will have on their careers. I can't offer much more; I always assumed it would go this way, because "free" trumps all other considerations, especially where Internet Content is concerned -- and who am I to say I wouldn't have taken the easy route myself? But it's another depressing milestone in the history of using technology to ignore society.

We should consider a lot of these AI models to have been built on stolen art, in the same way that GitHub Copilot is built on stolen code. Sure, artists made works available to download, and in some cases put them under copyleft licenses specifically so that they could be reused and remixed. Most artists publish in the full knowledge that their work will not just serve as stylistic inspiration for benevolent new works, but will be reposted without the watermark, traced over, or printed out and framed. But would some of them have made a different choice if they'd known it could be scraped en masse from Flickr/deviantArt/archive.org and fed into an automatic art-generating program?

If you publish something at a certain point in time, you should expect that it's out of your control and can be used in entirely unanticipated ways, because that's always happened, and that's the genesis of new and equally valuable art. But that's different from consenting to every consequence, and I think it's wrong for the researchers who fed this art into their models to decide on their own that they had an unlimited right to do so. What do they have the right to, though?

This is the latest and most thorny chapter in the book of how art offered in good faith gets remixed, commercialized, and co-opted. The entire Internet economy depends on enticing us to provide free content that someone can put ads next to. Elvis practically stole his music from Black artists, and made a fortune while they barely made names for themselves. But it's not just about taking money and fame that could have gone to someone else. What if you take free work, you offer a new free work, and it displaces paid work?

This is something our society has never seriously grappled with. Which makes sense -- we don't even have a system for compensating the people whose labor and wages have been directly replaced by automation and process improvements, even though that's eminently quantifiable (profit sharing! worker ownership!).

But we can't assume that each time our culture eats itself -- or more properly, business eats our culture -- something equal will be born from the ashes. Sure, artists will always create for the love of creating, but if we don't have a system that allows some of them to devote themselves to it, full-time, without a trust fund, I'm convinced that we will have less art, worse art, and emptier art. And we won't ever know what we're missing.

Don't get me wrong, I think AI will lead to new, valuable, and important art. I think it will allow more people to bring more thoughts to life in ways they never thought they could. But I think there is a substantial difference between how AI art has come about and is starting to be used, and past remix culture or artistic cross-pollination. I just don't know where to draw the line besides wherever money gets involved.

Maybe music is the closest analogy, at least because I can crib from pre-existing discourse about white rock-and-rollers and about sampling. Elvis, despite his flaws, still became part of the evolution of blues/soul/rock, and therefore he's inseparable from the crucial input to hip-hop sampling, a Cambrian explosion of not just musical styles but new ways of creating music without ever touching an instrument. Hip-hop is another genre that's thoroughly rooted in and identified with Black musicians, and another where white artists walk a fine line between collaboration and simple co-optation. How do we describe the differences between Public Enemy's It Takes a Nation of Millions to Hold Us Back and DJ Shadow's Endtroducing....., which wouldn't exist as such without dozens and dozens of samples? How do we describe the difference between Dr. Dre, who wouldn't be who he is without Parliament-Funkadelic melodies, and his protege Eminem, who has a kneejerk adversarial relationship to the concept of appropriation?

The closest I can get beyond "I know it when I see it" are some general principles: At the very least, credit has to be given; you can't just say "this is mine, I came up with it out of whole cloth." If you make some money off of it, you should clearly pay some back, and in a perfect world, you would somehow pay it forward, too. Maybe it's fair to say that if you remix, you automatically consent to your own work being further remixed -- but ideally you would be part of a two-way conversation with other artists, both taking inspiration and intentionally providing ideas and mentorship back.

I think that last principle brings me to the key thing that's missing from the ongoing genesis of AI art, and missing from our society in general: "Nothing about us without us." AI art is art, but where is the engagement with other artists? Would these researchers (who we do need to consider artists in their own right, now) have made any different choices about how they programmed, fed, and released these models if they had talked more with other artists, worked through some of the possibilities and consequences together? Would it have led to better AIs? We'll probably never know.

In the end, it may not matter. There's no closing the Pandora's box on the concept of AI art now that we know about it, and there isn't enough political will to make people throw away the tainted models and start over. Attributing their outputs to specific inputs is basically impossible, and what's more, people are probably training new models on the output of existing ones -- which means things are going to get Weird as errors compound over time, and I admit I'm interested to see how that will go.

As always, culture will continue from the point we have instead of another that might have been. It will probably be "fine," but maybe next time we'll consider that "better" is never guaranteed.



in reply to @hackermatic's post:

I feel about the various AI art generators as I feel about baloney or cheap t-shirts: people aren't interested in how they get made, because they wouldn't like the answer. The same is true for art: few people cared about artists, and now we can have art without artists. I'll go back to real media and be disgusted by the models, until one comes along that is ethically sourced: with compensation and consent for the works in the training set.