"Effective Altruism"? That was a solved question 154 years ago: you pay an autistic friend of yours $60k a year to synthesise an entire new school of economic and political thought.
queer code witch - 18
discord @mintexists
send me asks! :3
the only ethical use of generative AI tools is elaborate shitposting. it won't put anyone out of a job, actually allows some innovation in the form (unlike most other uses of the tools), and actively costs the companies money for little or no benefit on their part, if not an active detriment
generative AI imagery's specific powerful niche is "find the most remote corners of the network and edge cases that slide right between powerful concept clusters" to make something really fucked up that's an artifact not of the network's intent (epic sexy elon chad, standing on mars, ready player one, badass, corvette darkwave), but of its training and origins.
without generative AI we wouldn't have "portrait photograph of homer simpson holding his dog" or "portrait photograph of peter griffin embracing homer simpson". and that's not just because a traditional artist wouldn't paint something like this, it's because I created the images with the intent of highlighting something about the network and tool itself. the pictures aren't about homer simpson, they're about the features the training algorithms extracted from the absolutely massive training dataset and interpreted as "homer simpson," and the effect of structuring a prompt in exactly the right way to cause some kind of horrifying photograph/homer metastability. it's about the way the network rings when you strike it hard enough.
it's built to optimize its generated image to maximally match two mutually exclusive ideas at once, and it makes something horrifying. and to me that's something truly fascinating. all it takes to cause model collapse is eight carefully selected words.
and that's also one reason generative AI sucks so much in general: the hucksters who are trying to sell it as a solution to anything are trying as hard as they can to get rid of the weirdest, most interesting parts of its output, because it makes them look bad
I also would strongly encourage the use of generative ai in ways that attempt to highlight racial training biases, but that's less fun than photorealistic homer
Thesis: "AI art is only good for shitposts about Homer simpson"
Antithesis: "Using AI art to highlight biases and marginalization is real art, and profoundly human"
Synthesis: ethinically ambigaus Homer
Truly the dialectic is in motion today
I wrote this in an e-mail to people, but I figured I'd copy part of the response here since people keep asking.
…
We have people writing critical software. They are not migrating to new software anytime soon (modulo regulation-based incentives). But they have serious problems. Everything from vulnerabilities that are used by nation-state actors to quell dissidents, to not being able to change a typedef like intmax_t because the functions tied to it are baked into specific named symbols in an invisible way (ABI), to constantly seeing people's names getting butchered by Airlines, Databases, and Governments because they're using software that relies on the C locale and mangles names.
These are C problems. Not C++ problems. Not Java problems. Not Rust problems.
C problems.
My job is to solve C problems. That's the motivation. That's the coherent plan. When we stop having long-term, 20-to-40+ year problems, with 30+-year implemented existing practice that we never standardize despite it solving a wide variety of problems, that's when I'll stop writing C proposals.
Watching the 2nd hunger games movie. I don't rlly care much about the romance but the uprising is kinda interesting plot-wise. I don't remember the twist, I just know there is one, so this is gonna be really interesting