Malusdraco

re-entering my dragon era

  • they/them it/its

Malus || 27 || genderqueer aro-ace

should probably put more here but i'm a white ass IT professional and hobbyist artist. very opinionated but working on being less rude


Toyhouse (most art)
toyhou.se/Malusdraco
Ko-fi (digital sketchbooks, commissions and more)
ko-fi.com/malusdraco
Neocities (not super active)
witches-garden.neocities.org/
Dreamwidth (new home?)
malusdraco.dreamwidth.org/
Letterboxd (for the freaks)
letterboxd.com/malusdraco/

in reply to @Malusdraco's post:

Yeah, somewhere along the way people got so wrapped up in the 'ai war' that attempting any kind of nuance on the subject gets shouted down without thinking. I remember in a discord server a while ago I said AI art wasn't inherently evil and someone lost their schmoe and started accusing me of shit. I don't want to say people need to calm down, because people really are losing their jobs over this, but maybe the internet frenzy invites too reactive a response.

There is this thing with new technology where people blame the tools. Been true since the Luddites smashed the mechanical looms.

It's never the technology, never the machines. Machines are morally neutral. Even the mythical "orphan-crushing machine" isn't cackling with glee about crushing orphans; it's just a machine. The operators are the ones with moral agency.

A mechanical loom can save the labor of dozens, if not hundreds, of people. A large language model or other generative "AI" system can be used for genuinely important work that would be incredibly tedious for a human to do. Researchers are already using similar "large data set" training methodology to identify cancerous cells and generate novel molecular structures which could have medicinal uses.

It's not the machine. It's the people. A rock is a malicious technology if someone bludgeons you over the head with it, but that's only because they used it to bludgeon you. A large language model trained on only public domain and freely-given works could be an amazing tool, or even just a fun toy. Same with an "art AI" that uses public domain work. The problem comes when the owners of the thing try to use it to put other creators out of business.

An artist or artisan shouldn't have to fight to sustain themselves on the proceeds of their creations. That's the end of that sentence. Having to fight against "AI" is no different from having to fight against other artists who industrialized their output: Warhol's Factory screen-printing setup, Thomas Kinkade's use of "apprentices" to "finish" most of the work bearing his signature - it's all the same thing. The issue isn't the art. Is Duchamp's readymade series not art because he merely bought the pieces and signed them? Is a remix not a song?

It's not the tool, it's how it's used. In this case, people are using these systems to try and displace workers so that they can seize the money which would otherwise have gone to those workers for their labor. But the reduction in labor is not a bad thing in itself! There is no inherent dignity or moral value in labor for its own sake! If the Luddites had been able to stop crouching over a loom all day every day just to feed themselves, and had simply been allowed to feed themselves, that would have been great for everybody except the guy who hired a bunch of artisans to work looms, used the value he took from their labor to buy a mechanical loom, and then fired them.

"AI" intervention has the potential to make lots of types of work less crappy. The problem is that the people building them are building them with the intention of using them to make it so that less people have to work for them so they have to pay less people. Less people having to work is good! If someone doesn't have to look at tens of thousands of slide images of biopsy samples, and instead gets to do something more productive - that's good! If an artist doesn't have to make a specific background image for a big animation company, and gets to make something they personally want to - that's good!

The problem is that the people putting these systems into service are going "well, the machine does it - so go starve." That's not the machine's fault. The machine has no intent. It just does what it does. It's the owner of the rock, or the "AI," using it to bludgeon you that's the problem.

"AI" isn't bad. The ownership class is.

Sorry, this got WAY longer than I planned for it to. I guess it just struck a nerve with me. "This new technology is dangerous and bad" is a very reactionary mindset, in my opinion. Everything ever invented has had somebody saying it's going to ruin everything, from the printing press to genetically-engineered vaccines. It's never the technology itself - not even the nuclear bomb wants to hurt somebody. It just... exists. An object and the laws of physics. Nuclear fission can power spacecraft or annihilate cities; what matters is how it's used and who's using it. Same with this.
