People playing around with computers, and the things you can do with words to make something that creates new and novel things.
Over the pandemic I was working on a silly project of my own to generate fully-structured fake GameFAQs guides to RPGs that don't exist. It was a lot of fun, referring to prior works for inspiration as I went through and handmade all the various rules and bits of logic, but I got hung up on generating the game worlds and the ASCII art, and then I got employed.
And then, just like with every meme, it stopped being cool when the corps decided to get in on it. Now everyone I know who does AI or AI-adjacent work is wrestling with whether they're even allowed to keep working on their cool projects because of the State of Things.
I learned AI when AI didn't work. When you worked on AI because you had big dreams that nobody wanted to pay for.
When a neural net was a silly trick that could learn to play backgammon or something, and the cutting edge was someone figuring out how to make one that was devastatingly good at Race for the Galaxy, but it was pretty clear you needed to try something else to solve most problems.
When you could try applying all the latest tech to build a fancy decision model, only to find it underperformed something simple like a Naive Bayes or Random Forest classifier. Not because those models could learn complex things, but because they could learn the simple things you overlooked.
When you could get a boost by collecting a better corpus than anyone else had put together, and there wasn't a corporation that had already collected literally everything that could possibly be collected.
We tried a bunch of things and they mostly sucked, and once in a while you could make something cool happen in an unexpected way.
What joy of discovery is there in AI now? I see kids whose idea of an AI discovery is figuring out the magic words to whisper to a product owned by OpenAI, Microsoft, or Google. That's all they've got, I guess, because nobody could make something on their own that competes with a billion-dollar corporation with a financial incentive to make an AI model that devours everything.
The AI winter of the '90s came about because people got paid big money to make big promises they couldn't deliver on. It seems like we're heading there again. Hugely expensive Make Shit Up Machines are being presented as oracles of knowledge, and their backers say "oh, we'll fix the hallucination problem later", but the 'hallucination problem' is baked into the design of the system, and nobody working on it has the intellectual curiosity to design their way out of this local maximum, not when they're paid so well to stay in it.
I need AI to be unprofitable again.