
a system with an i9-13900K and an overclocked 4090 draws at least 1000W
good lord. at least the 7950X is decently efficient for its huge TDP, but christ
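rough back-of-envelope for that figure (every watt number below is an assumption in the ballpark of published review figures, not a measurement):

```python
# Toy power-budget sum for a 13900K + overclocked 4090 box.
# All numbers are rough assumptions, not measurements -- adjust to taste.
components_watts = {
    "i9-13900K (raised power limits, heavy load)": 300,   # ~253 W stock PL2, more when unlocked
    "RTX 4090 (overclocked / raised power limit)": 550,   # 450 W stock board power
    "motherboard, RAM, fans, SSDs":                 75,
    "PSU conversion losses (~10%)":                 90,
}

total = sum(components_watts.values())
for part, watts in components_watts.items():
    print(f"{watts:>4} W  {part}")
print(f"{total:>4} W  total at the wall (ballpark)")
```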
yeah I helped @ror build a new computer literally last night, and on the way home I was reading reviews for the 7700X talking about "but just wait for raptor lake, which could blow it away!" and man, not like this lmao
does this happen in cycles? where they push a technology to its hottest, most power-hungry limits, until the only way to 'innovate' is to make a more efficient chip, and then that tech takes over, and the cycle continues.
but also i'm sure they aren't concerned with power draw, since they can sell power-efficient chips separately to the people who do care.
I think the reverse is possible, too:
They start on a new process node and don't understand it fully. To get good performance, they push the power budget as high as it will go and make up for inefficient use of the new tech with raw power.
Then, in the next generation on the same underlying process node, they've understood the node and optimized for it enough to be both powerful and very efficient.
But since they've gotten everything out of the node they can, the next generation has to push to the next process node, which they don't understand enough to make efficient yet, resulting in another inefficient product. Etc.
I think what you've described happens too, but not in every case. I'd find it interesting to make a graphic or something plotting all the CPU/GPU generations by power use, power efficiency, and underlying process node, to see how this lines up in general. There's also definitely the choice to change, like, the default power budget depending on how the product is marketed (if you can't be the most powerful GPU, lowering the power budget and being the most efficient one instead could be a good ploy).
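Something like this could be a starting point for that graphic. The entries below are placeholders I made up just to sketch the shape; real values would come from review data:

```python
# Sketch of the generations-vs-power/efficiency chart idea.
# The tuples below are PLACEHOLDER numbers, not real measurements --
# they'd be filled in from review data.
import matplotlib.pyplot as plt

generations = [
    # (label, process node, peak package power W, relative perf-per-watt)
    ("Zen 3 (5800X)",        "TSMC N7",  140, 1.00),
    ("Zen 3 (5800X3D)",      "TSMC N7",  110, 1.15),
    ("Zen 4 (7700X)",        "TSMC N5",  140, 1.35),
    ("Raptor Lake (13900K)", "Intel 7",  300, 1.10),
]

labels  = [g[0] for g in generations]
nodes   = [g[1] for g in generations]
power   = [g[2] for g in generations]
perf_pw = [g[3] for g in generations]

fig, ax = plt.subplots()
ax.scatter(power, perf_pw)
for label, node, x, y in zip(labels, nodes, power, perf_pw):
    ax.annotate(f"{label}\n{node}", (x, y),
                textcoords="offset points", xytext=(5, 5))
ax.set_xlabel("Peak package power (W)")
ax.set_ylabel("Relative performance per watt")
ax.set_title("Power vs. efficiency across generations (placeholder data)")
plt.show()
```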
off topic, but i'm kind of confused how the 5800X3D is pulling less power than the 5800X, but i really don't know anything about processors
according to AnandTech's review at the time, it looks like it might be because it boosted to a cap 100 MHz lower than the original 5800X (but was faster clock-for-clock because it had more cache) https://www.anandtech.com/show/17337/the-amd-ryzen-7-5800x3d-review-96-mb-of-l3-3d-v-cache-designed-for-gamers/5
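to get an intuition for why a small clock cap saves power, here's a toy dynamic-power model (P roughly ∝ f·V²). the clocks and voltages are assumed numbers for illustration, not the real 5800X/5800X3D figures:

```python
# Toy model: dynamic power scales roughly with frequency * voltage^2.
# Clocks and voltages below are assumptions for illustration only,
# not the actual 5800X / 5800X3D figures.
def relative_power(freq_ghz: float, volts: float) -> float:
    """Dynamic power ~ C * f * V^2; the capacitance C cancels in a ratio."""
    return freq_ghz * volts ** 2

plain  = relative_power(4.7, 1.35)   # assumed higher boost clock and voltage
capped = relative_power(4.6, 1.25)   # assumed 100 MHz lower cap, lower voltage

print(f"capped chip draws ~{capped / plain:.0%} of the uncapped chip's dynamic power")
# Even a small clock cap can cut power noticeably, because the lower clock
# usually comes with a lower voltage, and power scales with voltage squared.
```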
ah ok that makes sense then. glad to see that i am at least, just barely, in the bottom half of this chart with my 5800x