ann-arcana

Queen of Burgers 🍔

Writer, game designer, engineer, bisexual tranthing, FFXIV addict

OC: Anna Verde - Primal/Excalibur, Empyreum W12 P14

Mare: E6M76HDMVU



nex3
@nex3

I think part of the reality around stochastic-model art that gets lost in a lot of the discussions is the materiality of the money behind it. One of the rehosts of my post from yesterday mentioned people using it as a "free art button", but I think it's really important to understand that it's only free because it's subsidized. Maintaining these massive probability databases isn't free, nor is scraping the entire internet to generate them, nor indeed is running an individual probability cascade to generate one image. And that's not even getting into the colossal number of engineer-hours invested to create the underlying technology—extremely expensive engineers who are a highly-specialized subset of the already-overpaid tech sector.

The companies that are allowing people to use these models are making a conscious business decision to underwrite randos on the internet using them for free. They're also making a decision not to try to enforce any kind of (dubious) copyright claim on the images their models generate. The interesting questions aren't "is this art" (imo it is bad art) or "is this intrinsically moral", they're "who are the capitalists exploiting with this" and "how does that exploitation function".

I hate and oppose this technology not because it's impossible to use it to create good art, but because the overwhelming actual use will be to harm artists and culture. As @tef likes to say, "the purpose of a system is what it does", and so far these stochastic models mostly seem to convince people with money to give less of it to people who make art. This is great for capitalists because it means they have to pay less for their art even if it cannot be made by these models, but more importantly because it turns the means of art production into capital. Suddenly whoever has the most money to invest in data centers and Ph.D engineers gets to charge rent on all the low-end commissions that would otherwise have gone to a bunch of different humans.

Technology is not neutral. If we fall into the trap of thinking about it as an object divorced of its material context, the capitalists have already won.


ann-arcana
@ann-arcana

It's the Uber strategy all over again, the only playbook tech has at this point.

The purpose of corporate generative art is a deliberate attempt to "break" another industry where, as yet, they have not been able to fully commoditize and control the entire landscape.

It's as simple as that.

I genuinely believe that the rich hate artists. Creativity is at once something they lack, and that they cannot control, and so they resent it intensely, and in the tech billionaire class there seems to be a particularly acute form of this resentment.

No matter how rich, how powerful, the money men live only on the ideas of other more creative minds, and they fucking hate us for needing us.

They have spent decades campaigning against the idea that artists should own the fruit of their labors and the means of their production, even as they sought to consolidate that ownership to themselves, and now, still unable to wrest absolute control, their fevered minds have convinced themselves that the infinite monkeys are real and if they just light enough oil and cash on fire they can replace us altogether.

I despise it for what it is: nothing less than an existential insult. A statement of intent, of their utter antipathy to us.

They are the enemy, and this is only their latest weapon.


tati
@tati

I fully agree with Annie D's analysis. The technology is a means to the specific end of driving the cost of art labor into the ground by exploiting a legal landscape that's too slow (and too hostile to labor) to keep up with tech's challenges to its meager labor protections.

The AI is simply an insulation layer from individual intellectual property challenges. Its benefits to the owners of the model are protection from legal liability and flooding the market to drive down the costs of labor.

I think it's interesting to consider the origins of 'Luddites' in this context. They were skilled workers who destroyed machinery to protest protracted mistreatment and underpayment by their bosses, who subsequently had them killed. Over time the term has been vulgarized to mean "anti-tech", but its true origins are still somewhere deep in the semantic network, because it's precisely when you point out the potential abuse of a technology by capital that the label gets brought out.

That's why I find the framing of the abuse of the legal framework to be more illustrative of what's actually happening: Uber leveraged exploitable contractor laws everywhere almost at once (on legal timescales) and in the process wreaked havoc on existing taxi unions and the 'driver' labor market overall, all in service of building a monopoly. That was its purpose and the reason it continued to receive funding year over year. Capital wants to do the same for other (actually all) labor markets.



in reply to @nex3's post:

Agree. And this is not a "future possible problem", btw. Macmillan and Tor.com are already using "AI" art for book covers (that look like shit; they didn't even bother to hire someone to clean it up, so the human figures are... geometrically challenged). Which is another way book publishers have found to underpay artists, just like they underpay editors and writers and... you know, the people who are actually involved in creating the book.

It's practically a guarantee that they'll be charging for it in short order. It'll still be significantly cheaper than commissioning an artist that has to put in actual physical work, but there is no way they're not going to monetize it. Probably when it gets good enough at hands for people to stop mocking them.

It's always shocking (but not surprising) to see ostensibly anti-capitalist people defend their AI indulgences. I've even seen artist friends using this shit. If I "just don't understand it", then I don't want to understand it. I'm tired of being required to understand the nuances of blips on a downward trajectory into an abyss.

there's nothing inherently capitalist about the thing itself: it's a kind of visual randomizer (or paragraph blender, if you want the language one, but I'll just talk about image synthesis here). they're interesting and even kind of engaging to play with, though kinda shallow.

how they were built, what goes into them, how they're valued, and how their creators and a lot of others hope for them to be used are all bleakly capitalist.

i maintain that planning to Disrupt and Revolutionize and Capitalize artistic endeavors is very different than just fuckin around seeing if you can make a picture of grimace committing an OSHA violation or clowns crashing a rowboat into the twin towers or whatever. i don't see what's shocking about exploring an interesting, randomized curve full of semi-sensical noise while also disliking capitalism

Something I'm missing from this is: usually, giving out a free or cheap service to the masses in the tech industry is done to gain adoption, then to abuse that adoption to bring in a more realistic price.

I don't doubt better models will be developed that will be attached to a service that charges a premium (still cheaper than freelance). But if the goal is to develop a system that allows companies to underpay for art, what would this extra adoption bring? Training data?

wait it's training data isn't it

I think training data is some of it, but I think it's mostly just advertising. They want everyone to be familiar with this stuff and used to using it so that once they start charging (less than human artists) people will think of it as an easy and readily accessible tool for a small fee. If it had only ever been available to researchers and a few journalists (as it was initially) before being offered for a charge, there's a good chance the public would have seen it as inaccessible and specialized.

Worth noting that this is exactly what AI Dungeon did. Started free, then sold the tech to VC and went commercial and started locking shit down and spying on users.

It's what they all do, because it's the only thing tech knows how to do.

in reply to @ann-arcana's post:

we do agree with this analysis. it wouldn't have to be about hatred, because financially speaking, that which cannot be controlled by capital is a threat to capital and so the incentive is to destroy it, but we do think it winds up being about hatred in many cases. there's rhetoric about it and everything.

Being around tech people and culture for the last ten years has been incredibly enlightening as to the antipathy some people have towards anyone in the arts. Certainly I dealt with a different sort of working-class anti-art rhetoric about "real jobs" growing up, but nothing like the way techies and STEM folks talk about artists sometimes, and I've come to realize it's just a reflection of where the capital flows today.