if you turn off the hardware power limits, it happily turns itself up to 6ghz, burns 400 watts, and thermal throttles, even with a 360mm x 60mm radiator
https://www.youtube.com/watch?v=rKE__VyrPII
this is a computer for nobody
a system with an i9-13900K and an overclocked 4090 draws at least 1000W
good lord. at least the 7950X is decently efficient for its huge TDP, but christ
yeah I helped @ror build a new computer literally last night, and on the way home I was reading reviews for the 7700X talking about "but just wait for raptor lake, which could blow it away!" and man, not like this lmao
does this happen in cycles? where they push a technology to its hottest, most power-hungry limits, until the only way to 'innovate' is to make a more efficient chip, and then that tech takes over, and the cycle continues.
but also im sure they're not concerned with power draw, since they can sell power-efficient chips separately for people who do care.
I think the reverse is possible, too:
They start on a new process node, and don't understand it fully. In order to get good performance, they push the power budget up to the biggest possible one, and overcome ineffective use of the new tech via power.
Then, next generation using the same underlying process node, they've understood the node and optimized it enough to be both powerful and very efficient.
But since they've gotten everything out of the node they can, the next generation has to push to the next process node, which they don't understand enough to make efficient yet, resulting in another inefficient product. Etc.
I think what you've described happens too, but it's not every case. I'd find it interesting to make a graphic or something of all the CPU/GPU generations with power use, power efficiency, and underlying process node, to see how this lines up in general. There's also definitely a deliberate choice of default power budget depending on how the product is marketed (like, if you can't be the most powerful GPU, lowering the power budget and being the most efficient one instead could be a good ploy)
off topic, but kind of confused how the 5800X3D is pulling less power than the 5800X, but i really don't know anything about processors
according to anandtech's reviews at the time, it looks like it might be because it boosted to a cap 100mhz lower than the original 5800X (but was faster clock-per-clock because it had more cache) https://www.anandtech.com/show/17337/the-amd-ryzen-7-5800x3d-review-96-mb-of-l3-3d-v-cache-designed-for-gamers/5
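for intuition, here's a rough back-of-the-envelope sketch in C, with made-up clocks and voltages (not the real 5800X/5800X3D specs): dynamic power scales roughly with V² × f, and a lower boost clock usually also means a slightly lower voltage, so backing off a little saves a disproportionate amount of power

```c
/* rough sketch only: relative dynamic power ~ C * V^2 * f, using
 * hypothetical clock and voltage numbers, NOT measured specs */
#include <stdio.h>

int main(void) {
    double f_hi = 4.6e9, v_hi = 1.30;  /* hypothetical higher-clocked part */
    double f_lo = 4.5e9, v_lo = 1.25;  /* hypothetical part boosting 100MHz lower */

    /* same effective switching capacitance assumed for both parts */
    double relative = (v_lo * v_lo * f_lo) / (v_hi * v_hi * f_hi);
    printf("relative dynamic power: %.2f\n", relative);  /* ~0.90 with these numbers */
    return 0;
}
```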
ah ok that makes sense then. glad to see that i am at least, just barely, in the bottom half of this chart with my 5800x
frankly ridiculous that upgrading your CPU from an i5-12500k to an i9-13900k will mean you have to buy a new PSU, a new motherboard, probably a new CPU cooler and a new, more airflow-ey case to fit it in. and it will be so, so much louder.
even funnier: this awful thing has 16 "efficiency" cores and only 8 big cores
the first SKUs they released with the big-little design were a disaster, because only the big cores had AVX512 execution units, so you could either disable the small cores or disable AVX512 support. if you enabled both, your threads would get cross-scheduled spuriously and crash on unsupported instructions, because they did not implement microcode for AVX512 emulation.
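a minimal sketch (not from the thread, just for illustration) of why that cross-scheduling was fatal, assuming gcc or clang on x86-64 and building with -mavx512f: the feature check and the instruction can end up running on different cores

```c
/* sketch: feature detection happens once, but the scheduler can move the
 * thread afterwards; the E-cores on those parts have no AVX-512, so a
 * 512-bit instruction there raises #UD and the process dies with SIGILL */
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    /* the check runs on whichever core the thread happens to be on right now;
     * a P-core with AVX-512 enabled reports yes, an E-core would report no */
    if (!__builtin_cpu_supports("avx512f")) {
        puts("no AVX-512 reported, taking the scalar path");
        return 0;
    }

    /* if the thread gets migrated to an E-core between the check above and
     * the instructions below, there is no microcode emulation to catch it */
    __m512d v = _mm512_set1_pd(2.0);
    __m512d r = _mm512_mul_pd(v, v);

    double out[8];
    _mm512_storeu_pd(out, r);
    printf("first lane: %f\n", out[0]);
    return 0;
}
```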
intel's solution to this was to send out an """update""" that permanently disabled AVX512 on every performance core.
the 13900K is listed as not even having AVX512 support at all, lmao
i wonder if the silicon is still there
intel is getting thrashed by AMD so hard that they're basically doing the equivalent of running the 100 meter dash while doused in flaming kerosene just to put up good benchmarks. it would be sad if they didn't suck so much
I imagine a jilted ex-boyfriend going "fine, so the mobile market doesn't like me anymore? fine! Fine! FINE!!!"