TalenLee

As Yet Untitled Work



I've been working with generative art machines and recommendation algorithms and whatnot, and I've realised how hard it is for these machines to successfully divine what I don't want them to do. It's pretty easy for an algorithm to find something I might like based on common traits, but it seems that none of these systems are capable of working out what 'without' means.

Like, I watch some cooking shows on Youtube. I watch very, very few cooking shows. I am subscribed to almost none of them. So they will, from time to time, show me cooking shows. And it seems to me that coffee shows are in the same bubble as cooking shows.

I tell Youtube, over and over, don't show me coffee stuff. Don't! I don't drink it, I don't like it, nobody is going to convince me to change my opinion with a fucking youtube video, I am interested, sure, in recipes for gyoza I'll never make, but I'm never going to want to watch coffee material. But it doesn't know it's 'coffee content' I don't like, because they show me a bunch of cooking videos and I don't look at 99% of them. So sure, I don't like this coffee show and that coffee show and that coffee show, but who can say it's because it's a coffee thing?
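To make the shape of the problem concrete, here's a toy sketch in Python of a tag-overlap recommender. This is not how Youtube actually works, and every title and tag in it is made up; the point is just that the only negative signal it has is a blocklist of individual videos, so 'not interested' never turns into 'no coffee, ever':

```python
# Toy sketch of a content-based recommender (hypothetical data throughout).
# Candidates are scored by how often their tags show up in watch history.
from collections import Counter

watch_history = [
    {"title": "How to fold gyoza", "tags": {"cooking", "food", "kitchen"}},
    {"title": "One-pan weeknight dinners", "tags": {"cooking", "food"}},
]

candidates = [
    {"title": "Perfecting pour-over coffee", "tags": {"coffee", "food", "kitchen"}},
    {"title": "Why your espresso is sour", "tags": {"coffee", "kitchen"}},
    {"title": "Dumplings three ways", "tags": {"cooking", "food"}},
]

# Clicking 'not interested' only blocklists specific videos, not a concept.
not_interested = {"Perfecting pour-over coffee"}

def tag_profile(history):
    """Count how often each tag appears in the watch history."""
    profile = Counter()
    for video in history:
        profile.update(video["tags"])
    return profile

def score(video, profile):
    """Score a candidate by how many of its tags the viewer has watched."""
    return sum(profile[tag] for tag in video["tags"])

profile = tag_profile(watch_history)
for video in candidates:
    if video["title"] in not_interested:
        continue  # this one video is hidden...
    # ...but 'Why your espresso is sour' still scores above zero, because it
    # shares the 'kitchen' tag and nothing in the score represents "no coffee".
    print(video["title"], score(video, profile))
```

Hiding one coffee video just removes that one row. The next coffee video still rides in on the same 'kitchen' and 'food' overlap that my gyoza habit keeps rewarding.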

Generative art toys struggle with being told 'don't do this.' You can tell them to leave things out, or not to include things, or to downplay things, but because these things aren't operating on a theory of mind or representing a real thing that exists, if you ask them 'hey, make a picture of a cute anime boy with long hair, with his hand behind his back'

... it doesn't know where 'behind his back' is. A thing that consistently makes these things struggle is diegetic nonrepresentation. You can tell it 'don't show hands' but that doesn't mean it's going to put the hand behind a back. It's one of those reminders that these systems aren't working off understanding you, they're looking at a set of data and plugging it into other data.
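For what it's worth, this is roughly what a 'negative prompt' is in the common image-generation tools. Here's a minimal sketch using the Hugging Face diffusers library, assuming you have a CUDA GPU and the Stable Diffusion 1.5 checkpoint; the prompt is just the example from above. The negative prompt steers generation away from the embeddings of those words. Nothing in it represents where a back is, or what it means for a hand to be behind one:

```python
# Minimal sketch with the Hugging Face diffusers library.
# Assumes a CUDA GPU and the runwayml/stable-diffusion-v1-5 checkpoint.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    prompt="a cute anime boy with long hair, with his hand behind his back",
    # The negative prompt nudges denoising away from these words' embeddings.
    # It doesn't encode 'the hand is occluded by the body', so you still tend
    # to get a visible hand somewhere, or an arm that just stops.
    negative_prompt="visible hands, extra fingers",
    num_inference_steps=30,
).images[0]

image.save("anime_boy.png")
```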



in reply to @TalenLee's post:

Another way to explain this that might work for some people is that the generator does not actually know how to comprehend a sentence; it just has a concept of what a hand is, and you gave it the word "hand", so you're getting hands now.