- The total number of FLOPS being spent on AI/ML compute worldwide is a Stupidly Huge Number
- The overwhelming majority of AI/ML compute goes towards tokens processed by LLMs/genAI
- Nvidia remains a stupidly successful money printer
- Companies that buy up Nvidia chips for these "AI factories" are also making serious fucking bank (amused that these facilities got called "revenue factories" before anyone brought up "intelligence" tasks)
- They 100% expect the money train to keep running, yet frame it as growth that can be infinite but also "sustainable", like that's even possible. Amazon is building a massive new "AI factory" with "Secure AI"
- GPU abstraction has now reached a high enough level that apparently no one has to care about partitioning tasks across discrete hardware and memory units anymore, and it all happens at GPU bus speeds, if I'm understanding correctly? (courtesy of their shiny new Blackwell architecture; there's a rough sketch of the kind of abstraction I mean right after this list)
- The future of AI/ML tasks is in "microservices": deploying what sound like Docker containers for small tasks whose outputs can be aggregated into a "Super AI" for final evaluation. So... serverless for AI tasks? (second sketch after the list for this one)
- Oh yeah, some people occasionally use our hardware for Actual Science instead of art theft, can't forget those guys
- They have fucking cute little beeping, walking droids now that you could just dump onto a Star Wars movie set and no one would question it (trained to walk in Isaac Sim, an nvidia Omniverse app)
- It's very clear to me that nvidia is going to continue dominating the fuck out of this market segment
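
To be clear about what I mean by "no one has to care about partitioning": here's roughly what that already looks like from userland with off-the-shelf tooling. This is Hugging Face's transformers/accelerate stuff, not whatever Nvidia's actual Blackwell/NVLink magic is under the hood, and the model name is a made-up placeholder, so treat it as a sketch of the idea rather than their stack.

```python
# Sketch only: how "don't think about which GPU or which memory unit" looks
# from the caller's side today, using Hugging Face transformers + accelerate.
# This is NOT Nvidia's Blackwell stack; the model name below is hypothetical.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "some/enormous-model"  # placeholder; swap in a real checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name)
# device_map="auto" (needs the accelerate package installed) lets the library
# split the model's layers across however many GPUs and however much memory it
# finds, so the caller never does the partitioning themselves.
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

inputs = tokenizer("why is this keynote two hours long", return_tensors="pt")
inputs = inputs.to(model.device)  # first shard's device; the library routes the rest
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```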
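And here's my guess at the "microservices aggregated into a Super AI" thing, written as plain HTTP calls. None of these endpoints are real (this is not Nvidia's actual NIM API or anything); it's just the fan-out/fan-in shape of the idea as I understood it from the keynote.

```python
# Sketch only: small containerised model services, fanned out and then fanned
# back in to an "aggregator" for final evaluation. All endpoints hypothetical.
import requests

# Hypothetical endpoints: each container does one small task.
WORKERS = {
    "caption": "http://localhost:8001/v1/infer",
    "ocr":     "http://localhost:8002/v1/infer",
    "detect":  "http://localhost:8003/v1/infer",
}
AGGREGATOR = "http://localhost:9000/v1/aggregate"  # the "Super AI", allegedly

def run_pipeline(image_url: str) -> dict:
    # Fan out: one request per small task.
    partials = {
        name: requests.post(url, json={"input": image_url}, timeout=30).json()
        for name, url in WORKERS.items()
    }
    # Fan in: hand every partial output to the aggregator for "final evaluation".
    final = requests.post(AGGREGATOR, json={"partials": partials}, timeout=60)
    return final.json()

if __name__ == "__main__":
    print(run_pipeline("https://example.com/cat.jpg"))
```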
I dunno what else to say other than this. I know there are plenty of people who are hoping that the AI bubble is going to burst aaaaaaaaany moment now, but I don't think that's necessarily the case, because unlike crypto and NFTs, LLMs and genAI are something even little Timmy can get in on to generate a picture of a Genshin Impact character throwing a molotov at a tank.
While companies like OpenAI are running at a loss, the real money is in building up and building out these massive data centres. This is a hardware racket, not a software one, and business is extremely good, having bounced back from the chip shortages of a few years ago. I'm reminded of someone on cohost going on a rant about how computer storage manufacturers give zero fucks about consumer markets. It's pretty much that kind of situation. Nvidia only cares about OpenAI insofar as OpenAI gives tech giants reasons to buy chips and storage, and techbros will endlessly chuck money at OpenAI because they've been sold the lie that if we just add a few billion or trillion more transformers, we'll get our AI god.
Like no shit, bouncing off some of the discussion on AI I saw going around today, I feel like this has transitioned from standard Capitalist techbro territory to an AI religious cult. They will keep throwing money at this because they believe immortality is within their grasp, and nvidia is more than happy to serve up the chips to let them chase that dream to their graves.
The only chance we have to stop this environmental disaster (the keynote barely even mentioned the cooling requirements for their new chip architecture) is legislation. We need actual environmental protections, actual environmental agencies that have teeth and can bite hard.
If all that compute were mostly being used to train these little critters to walk, I'd be okay with that. I could live with that. I want to live in a world with cute little robots that aren't gumdrop cop roombas.

































