There's a reason I refer to "AI art" as statistical picture generators: not only do they have no understanding of art, or indeed pictures; they don't even leverage any human understanding of art, or indeed of pictures, in the way they do it.
You feed a bunch of art into one program, and it builds a statistical model relating metadata to pixel values. You feed new metadata ("queries") into another, and that program uses the model to interpolate a new, statistically correlated pixel collection.
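To make the shape of that two-program pipeline concrete, here's a deliberately tiny sketch in Python. Everything in it is my own illustrative assumption, not how any real generator is built: the "model" is just a per-tag mean of pixel values, and the function names, tags, and 8x8 toy images are invented for the example. Real systems use enormous neural networks, but the train-then-interpolate shape is the same.

```python
# Toy sketch of the two-program pipeline: train a statistical model relating
# metadata (tags) to pixel values, then "generate" by interpolating from it.
# The per-tag-mean "model" is a stand-in for illustration only.
import numpy as np

def train(images: list[np.ndarray], tags: list[set[str]]) -> dict[str, np.ndarray]:
    """Program 1: build a model relating metadata to pixel values.
    Here the 'model' is simply the mean image observed under each tag."""
    buckets: dict[str, list[np.ndarray]] = {}
    for img, img_tags in zip(images, tags):
        for tag in img_tags:
            buckets.setdefault(tag, []).append(img)
    return {tag: np.mean(stack, axis=0) for tag, stack in buckets.items()}

def generate(model: dict[str, np.ndarray], query: set[str]) -> np.ndarray:
    """Program 2: interpolate a new pixel collection statistically
    correlated with the query, by blending what the model stored
    for each queried tag (assumes the tags were seen in training)."""
    relevant = [model[tag] for tag in query if tag in model]
    return np.mean(relevant, axis=0)

# Toy usage: one hundred random 8x8 "images", two tags.
rng = np.random.default_rng(0)
images = [rng.random((8, 8)) for _ in range(100)]
tags = [{"cat"} if i % 2 else {"dog"} for i in range(100)]
model = train(images, tags)
picture = generate(model, {"cat", "dog"})  # a pixel grid, not an artwork
```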
The trick works, to the extent it even does (it's really visible that it doesn't work on a human level! Hands!), the way that statistical measures usually do: only at scale.
They didn't train the models on literally billions of ethically and legally dubiously sourced images because they thought that was a great idea — they did it because the trick works worse and worse the less data you train it on. The indefensible data slurping is an intrinsic practical necessity.
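For a sense of why the scale is non-negotiable, here's a toy, entirely hypothetical illustration: a plain averaging estimate of the kind sketched above gets dramatically less noisy as the number of training images grows, roughly as 1/sqrt(n). That is the dynamic that pushes these systems toward billions of images rather than hundreds.

```python
# Toy illustration of scale dependence: the error of a statistical estimate
# (here, just a mean over noisy images) shrinks roughly as 1/sqrt(n).
# The setup is invented for illustration; it is not any real training run.
import numpy as np

rng = np.random.default_rng(1)
true_image = rng.random((8, 8))          # the "ideal" picture for some tag

for n in (10, 1_000, 100_000):
    # Each training image is the ideal picture plus noise.
    samples = true_image + rng.normal(scale=0.5, size=(n, 8, 8))
    estimate = samples.mean(axis=0)      # what "training" recovers
    error = np.abs(estimate - true_image).mean()
    print(f"n={n:>7}: mean pixel error {error:.4f}")
# Typical output: error falls from roughly 0.13 at n=10
# to roughly 0.0013 at n=100000.
```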
Read it again, slowly: "AI art" only works, and can only work, on the back of stolen images, because it works, and can only work, when fed images in bulk exceeding any human capacity to legally, never mind ethically or considerately, source.
They have to feed it stolen data. They have to. Or it doesn't even fucking work.
Friends, I understand the impulse to say "well, you know, the problem is the capitalists monetising it, and/or the way it's deployed; surely, surely, if we wrest it from them and put 'AI art' in the hands of artists (like a Photoshop filter!) it will no more destroy Art than Photoshop filters did."
But I'm afraid you have to acknowledge that the "Photoshop filters didn't destroy art" thing is an analogy, and it breaks down at exactly the point that matters: Photoshop filters are not powered by mass data theft as an unavoidable necessity.
Blood diamonds do not become ethical if they're only worn by nice people. "AI art" picture generators cannot be made ethical by only nice artists using them. There are ethical problems in the supply chain which are fundamental to the picture generators working at all.