snowmiaux

social justice snowcat

Slinky and sassy, Ghostpaw is a snow leopard of many gifts: a waterfall of luxurious, white, waist-length hair, an even six feet of curvy physique, and sultry, violet eyes. But his tail is even thicker, longer, and more sensuous than these. Thick white-grey fur patterned in dense black rosettes could easily hide his figure, but instead the smooth, sumptuously trimmed fluff highlights his toned form. Waist down and from behind, Ghostpaw's curves are quite femme - long hair distracts from strong shoulders as a shapely heart of an ass supports his lush tail. From the front and abs-up, those shoulders and a beard-like tuft on his boxy chin are far more masculine. He is as much an Aphrodite as an Adonis - a beauty with gender signals too weak to prevent a quick, mistaken glance, but strong enough to confirm with a lingering, appreciative look.


snowmiaux.com
Telegram: @Ghostpaw
meow.social/@Ghostpaw

exodrifter
@exodrifter

I think a lot of people really underestimate how difficult it is to translate what we want in our heads into something someone else can understand. However, this problem is treated as practically non-existent whenever AI is marketed as a tool for creating content.

This conceit reveals itself as soon as a text prompt is used as the input. The idea here, if it isn't obvious enough, is that a text prompt is the easiest, fastest way for you to get what you want from the AI. However, the languages we use still have a TON of ambiguities. It's very difficult to specify exactly what you want to someone, AI or not.

This is why, as we're seeing with AI right now, a lot of time gets spent on figuring out how to reduce that ambiguity so the AI will do whatever its creators think "the right thing" is. You get AI models trained to make specific kinds of things, you get more ways to nudge the AI in what is hopefully the right direction, and you get the creation of the "prompt engineer" role, whose whole purpose is to figure out how to speak to the AI in the "right way" to produce "the right thing".
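
To make that concrete, here's a rough sketch of the kind of thing a "prompt engineer" ends up doing. The generate_image client below is hypothetical and the prompt text is made up; the point is just how much extra specification gets bolted on to squeeze the ambiguity out of plain English.

# Hypothetical stand-in for whatever image-generation API a company is paying for.
# A real client would call a remote model; this stub just echoes the request so
# the example runs on its own.
def generate_image(prompt: str, negative_prompt: str = "") -> str:
    return f"[image for: {prompt!r} | avoiding: {negative_prompt!r}]"

# What the company imagines typing:
naive = generate_image("our mascot, but heroic")

# What the prompt engineer ends up writing after rounds of trial and error,
# piling on constraints to pin down "the right thing":
engineered = generate_image(
    prompt=(
        "corporate mascot, anthropomorphic red fox, full body, heroic pose, "
        "flat vector illustration, brand colors only, plain white background"
    ),
    negative_prompt="photorealistic, extra limbs, text, watermark",
)

print(naive)
print(engineered)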

Honestly, I imagine most situations where companies try to use AI look like this...

company: Wow, AI can replace artists!
company: Oh, actually, I can't get the AI to do what I want. Maybe we want a prompt engineer?
company: But if we want to replace artists to lower our payroll, we'll have to pay the prompt engineer less than we pay artists for this to be worth it.
prompt engineer: oh no

...and now, the company is talking to a human again, instead of an AI. Why? Because they feel like a human can understand them better than an AI can. But, well, that's at least one reason why you had an artist in the first place.

And now, I'm going to quote the article On the foolishness of "natural language programming" by Dijkstra, because it's so based:

The virtue of formal texts is that their manipulations, in order to be legitimate, need to satisfy only a few simple rules; they are, when you come to think of it, an amazingly effective tool for ruling out all sorts of nonsense that, when we use our native tongues, are almost impossible to avoid.
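
To show what that looks like in practice, here's a toy example of my own (nothing from Dijkstra's text): the English request "sort these by size" can be read several different ways, while the formal one-liner below can only be read one way, and checking it only means following the language's few simple rules.

files = {"notes.txt": 12, "video.mp4": 88_000, "readme.md": 3}  # name -> size in bytes

# "Sort these by size": by byte count or by name length? Ascending or descending?
# English leaves all of that open; the formal text below admits exactly one reading.
by_size_desc = sorted(files, key=files.get, reverse=True)

print(by_size_desc)  # ['video.mp4', 'notes.txt', 'readme.md']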

