• he/him

I occasionally write long posts but you should assume I'm talking out of my ass until proved otherwise. I do like writing shit sometimes.

50/50 chance suit pictures end up here or on the Art Directory account. Good luck.

Be 18+ or be gone, you kids act fuckin' weird.

pfp by wackyanimal

I tag all of my posts complaining about stuff #complaining, feel free to muffle that if you'd like a more positive cohost experience.

Art and suit stuff: @PlumPanAD

DMs: Feel free to message as long as you have something to talk about!


That's right, prepare for some full-on "back in my day" posting.

So there's some new Intel CPU out and they basically just changed the number and some voltage curves. This is the culmination of many years of the custom-built PC becoming much much much faster, but also significantly less interesting. It got me reminiscing about the gOoD oL dAyS and I figured I'd share that here. I was VERY tempted to actually make this a Project, go get a ton of old hardware and run lots of benchmarks, but I don't have the money or space to plow into fancy trash right now.

In fact, I'm not even going to put in the research to make this a High Effort Post. I'm going to go 90% off of memory, and 10% finding links to articles from back when that have benchmarks. If any of this sounds wrong: It probably is! You've been warned! But let me know about it in the comments below :eggbug-wink:

And shares are off because I wanted to write this more than I want notifications to go up.


Overclocking (, libre)

For the longest time, how fast your computer was related strictly to how fast your CPU was.

Actually that's an outright lie. If you wanted your computer to feel faster you needed faster storage. In the late 90s and early 00s that meant either a sketchy built-in hardware RAID 0 that came on your nicer motherboard, or a hilariously expensive SCSI controller and server drives. But CPU performance was also a thing.

The thing is, in the era I'll be discussing, the 00s, this primarily meant single-core performance. It wasn't until the end of the decade that you could reliably expect most software to use more than one thread. Yes, some tasks are embarrassingly parallel and got that ability sooner, but for the most part having more cores, or even just more threads (HyperThreading was originally conceived just to try and keep the Pentium 4's gigantic pipeline filled against all odds), just meant having a system that would behave better with more than one heavy application open at once. Games in particular were very slow on this front, and going for a faster single or dual core (depending on the year) was a common choice.

Quick side note: if you weren't around back then, we used to close EVERYTHING when we wanted to run a game. You shut down your web browser, maybe your chat clients, anything else you had open. You'd close ALL of that shit because any other application would steal CPU cycles and eat memory. The concept of a Steam overlay running was not a thing.

So if you were a CPU maker, what was your product stack? Well, it was usually a bunch of the same CPU with different clock multipliers, and usually some differences in cache size. That's it. And if the primary differentiating factor between a $150 CPU and a $500 CPU is the clock speed... then you start to see why overclocking was a big deal. On top of that, generational gains were substantial back then; it was common to see a benchmark chart where the cheapest new CPU was on par with the most expensive from the last generation, at least for major updates (not just die shrinks). By the end of the decade we had affordable quad cores, but again, most things used two threads at best.

There were many occasions where a CPU, for one reason or another, would be dirt cheap and overclock like mad. Overclocked, they'd usually match or beat the performance of the highest-end chips. Here's a quick list of highlights:

The Celeron 300A, a $180 or so CPU that when overclocked was similar in performance to the $670 Pentium II 450, the fastest (desktop) CPU you could buy at the time.

The Opteron 144, a server CPU that was originally expensive but could later be had for under $200, even close to $100 in 2005. The later revisions could easily overclock to almost 3GHz, putting it on par with AMD's fastest single core chips at the time, which were still well north of $500.

The Pentium D 805, best summed up by the headline of this Tom's article from 2006: "A 4.1 GHz Dual Core at $130 - Can it be True?" It was true, and in a lot of cases that was faster than any stock Intel CPU, even the $1000 ones.

The Core 2 Duo E4300, seen here as a ~$170 would-be-1.8GHz chip outperforming everything at 3.38GHz. With a good platform underneath these would happily do 3.6GHz, literally doubling the clock speed. Funny thing, they mention a planned price drop in the article too, said price drop being months away.

The Core 2 Quad Q6600, arguably the last great overclocking bargain. These launched hella expensive ($600+???) but within 6 months they were under $300 and had a new stepping that overclocked very well. Most people bought these as $180 chips in 2008, and most ended up running at 3.4-3.6GHz. Per usual, they outperformed most non-overclocked CPUs at that speed.

And these are all just the memorable highlights. In this era there were a LOT of CPUs that could gain the better part of 1GHz of clock speed from overclocking, from chips that ran 2-3GHz stock. Performance isn't linear with clock speed, but there were big gains to be had. The key to all of these, if you read into the articles, is that it wasn't just a "set the FSB and go" thing. You had to know what you were doing, plan your parts out ahead of time, and spend a bit more money on everything that touched the CPU: motherboard, RAM, cooler. And they used a lot more power. You usually ended up on some forum asking questions and learning things, and reading review articles about what parts you might buy.
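If you never had the pleasure: the napkin math was core clock = FSB × multiplier, with the multiplier locked on most chips, so the FSB was the knob you actually turned, and the memory (and everything else on the board) was chained to that same bus. Here's a rough sketch of that; the 300A and E4300 numbers are the usual quoted ones, and the 1:1 memory ratio is a simplification:

```python
# Back-of-the-napkin math for old-school FSB overclocking.
# Core clock = FSB x multiplier (the multiplier was locked on most chips),
# and memory hung off the same FSB through a ratio, which is why cranking
# the FSB dragged your RAM (and chipset, and everything else) along for the ride.

def clocks(fsb_mhz, multiplier, mem_ratio=1.0):
    """Return (core MHz, memory MHz) for a given FSB, multiplier, and memory ratio."""
    return fsb_mhz * multiplier, fsb_mhz * mem_ratio

# Celeron 300A: 66.67 MHz x 4.5 = ~300 MHz stock; bump the FSB to 100 MHz for 450 MHz.
print(clocks(66.67, 4.5), "->", clocks(100, 4.5))

# Core 2 Duo E4300: 200 MHz x 9 = 1.8 GHz stock; 400 MHz FSB -> 3.6 GHz,
# and at a 1:1 ratio your DDR2 now has to keep up at 400 MHz (DDR2-800).
print(clocks(200, 9), "->", clocks(400, 9))
```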

This may not be ideal if you just want a faster computer, but as a hobby it was fantastic. You learned things, and got something from it. It introduced you to people (for better or worse) and gave you something to talk about. And if you played your cards right, you saved a bit of money... or got something a bit faster for the money you spent.

2011: Sea Change

2011 ended up being substantial for PC building for three reasons: Sandy Bridge, Bulldozer, and SSD prices. Let's talk about those in an entirely different order.

Bulldozer was AMD's new architecture, released in 2011. If you thought AMD was always the cheaper, obviously-not-as-good option, it's because of 'dozer. They tried to build modules with two integer cores and a single shared floating point unit, but marketed core count based on the total number of integer cores. Everyone hated that, but worse still it didn't matter, because they were just slow. AMD had been somewhat competitive with Phenom II, had a dud with the original Phenom, but before that, going all the way back to the original Athlon, they had been at least price competitive if not outright performance competitive. The company whose whole reason for existing for years was basically to give Intel competition slipped on a banana peel and landed flat on their ass, and that lack of competition would have a large influence on Intel's actions.

SSD prices coming down was probably the most important thing to happen for PC performance in the past.... at least 20 years. By 2011 you could reasonably consider putting an SSD in your computer big enough to hold an OS and some games even if you weren't filthy rich. This usually meant 80-120GB for 200-some-odd dollars, but was well worth it. All that stuff about SCSI and on board RAID from earlier? Useless now, just get an SSD. A formerly complicated and expensive solution was wholesale replaced with a simple one, if you had the coin.

Sandy Bridge was Intel's third Really Good new architecture in a row. Pentium 4 (read: NetBurst) was a literal hot mess, but Core 2 almost immediately put them back on track, and the chips got even better with newer steppings (see above). Nehalem was another huge performance jump, but despite the "consumer" socket 1156 chips being a thing, I mostly remember people opting for the more expensive socket 1366 setups, complete with triple channel DDR3, to run an i7 920, because they were so obnoxiously fast at the time. Which meant that despite there being some fuckery going on with the first gen "Core i" CPUs on 1156, it wasn't that big of a deal. K SKUs? Built-in IGPs? Whatever, just get the 920. Sandy Bridge was a number of things. It was fast, not quite as noticeably as Nehalem was, but still enough faster to matter. It was power efficient, like obnoxiously efficient compared to Nehalem. And it completely locked down overclocking. If you wanted to overclock, you had to buy a "K" SKU, and those started at a bit over $200.

So why does that matter? You have to pay more for the CPU, but the K SKU has an unlocked multiplier; all you have to do is tell it what speed to run and that's it! No worrying about bus speed, memory dividers, none of that crap. Yeah, you had to buy a motherboard with the "Z" series chipset, but you would have had to buy a better motherboard previously to get a good overclock anyway, so the price difference wasn't THAT significant. And they overclocked really, REALLY well. Just like the overclocking hits of old, the Sandy Bridge K chips would easily add another 1GHz to their stock speeds. I recall stories of particularly good chips hitting 5GHz on air even for daily use, with the memories of Intel's failed 4GHz Pentium 4 launch still fresh. It was impressive stuff, and I think many would consider those chips some of the best ever made, not just for how fast they were at launch but for just how long you could eke them out. A heavily overclocked 2600K wasn't holding most people back even 6 years after launch, which wasn't something that had ever really happened since new CPUs started coming out on a regular basis.
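For contrast, here's the same napkin math Sandy Bridge style; the 2600K figures below are the stock ones, and the 45x is just a typical daily overclock people reported, not a promise:

```python
# Sandy Bridge-style unlocked-multiplier overclocking, for contrast.
# BCLK sits at (roughly) 100 MHz and you mostly leave it alone, because memory
# and PCIe hang off it; the CPU multiplier is the only knob you turn.
BCLK_MHZ = 100

def core_clock(multiplier):
    return BCLK_MHZ * multiplier

print(core_clock(34))  # i7-2600K stock: 3400 MHz
print(core_clock(45))  # a typical daily overclock: 4500 MHz, with RAM and PCIe untouched
```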

Tick-Tock: faster and duller

But everything was starting to converge. It made more sense for everything to be easier to set up, especially since you can charge a premium for that convenience. Every year Intel would release a new set of chips, a bit faster than the old ones and a tad more expensive. You could get a quad core i7 with HyperThreading, or spend a bit less for an i5 without the HT. There'd be a new set of motherboards, occasionally a new DDR standard, and the best options became pretty straightforward. None of the new CPUs overclocked quite like Sandy Bridge did, but they were faster anyway, so who cared? Later on some people bemoaned the lack of core count increases, but that didn't matter for games, and Intel was happy to sell you up to their "HEDT" platform, spiritual successor to the 1366 days, if you could afford it.

By the time AMD finally pulled their thumb out we had basically gotten four identical Intel CPU launches. Same lineup, similar pricing, slightly decreasing performance gains. AMD brought a lot of competition, particularly in core count, the one thing you can't do shit about as a power user[1], but the other thing they brought was unlocked CPUs. Almost all of their chips were unlocked, but the catch was they had very little headroom. They were sold running at about the optimal speed already; you could squeeze a bit more out, particularly for all-core performance, but it was a lot of work for not a lot of reward in most cases. Within a few generations they had brought over a CPU that basically overclocked itself on the fly, just like GPUs had been doing for years, and for the vast majority of people CPU overclocking became "would you like it mild or hot?" with the CPU doing the rest of the work. Intel managed to find their own banana peel right as AMD came back, and eventually they were forced to sell CPUs also running close to optimal, and later just outright maximum, clocks out of the box. They also had to implement their own self-overclocking system, with basically one option to change if you want it a bit spicier.

And don't get me wrong, this is all ultimately good if you want a faster, easier to use PC. The chip being able to clock itself on the fly is just better than a manual overclock at the end of the day. It knows how hot it is and how much current it's drawing at any given moment and can adjust accordingly, instead of me turning on Prime95 and running it overnight to make sure it doesn't overheat or crash. Instead of getting an idea of what speed to aim for from forum posts, it knows what speeds it can do because the people that made the CPU put that in there. There's no real question of how fast it might go, because the quality control is so good now that every CPU in the same SKU can be expected to land within 100MHz of the others even with dynamic overclocking.
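If you want to actually watch a modern chip do this dance, here's a tiny sketch using Python's psutil; assuming you have it installed and your OS reports frequencies at all, it just polls load and reported clock speed so you can see the boost behavior move around:

```python
# A tiny way to watch a modern CPU clock itself, assuming psutil is installed
# (pip install psutil). What it reports depends on your OS, so treat the
# numbers as approximate rather than gospel.
import psutil

for _ in range(10):
    load = psutil.cpu_percent(interval=1)  # average load over the last second
    freq = psutil.cpu_freq()               # current/min/max frequency reported by the OS
    print(f"load {load:5.1f}%  clock {freq.current:7.1f} MHz "
          f"(min {freq.min:.0f}, max {freq.max:.0f})")
```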

But as a hobby, picking which parts to put in your computer has now become mostly a question of what you think looks good and how much you're willing to pay. If you pay more, you get more performance. You can overclock and tweak and fiddle, but the gains are small. Computers are already so fast that it takes the best and brightest minds making software incredibly overburdened to cause any substantial trouble for modern CPUs; that magic 10% gain from finding every perfect memory timing and voltage level matters less than it ever has.

But such is life sometimes.


  [1] Yes, I know about the AMD core unlock days, but mentioning them breaks the flow.



in reply to @plumpan's post:

If you wanted your computer to feel faster you needed faster storage.

Kinda. Anecdotally, I only overclocked two of my computers, a Celeron 800 (Coppermine) and an Athlon XP 2000+, and mainly because I wanted better fps in games and better compile times for Half-Life maps. Off the top of my head, squeezing 1 GHz out of the 800 MHz Celeron chopped 30 minutes off a 3-hour compile on Zoner's Half-Life compilation tools.

Overclocking the Athlon was less useful, sadly. It only squeezed out 2-3 more FPS in Half-Life 2, though that was coupled with me running it on a Radeon 9600. Since then, I've never overclocked anything, and actually actively downclocked my recent Ryzen 9 laptop because I wanted it to run less hot.

Don't forget, builds used to just be... more difficult. To put any of it together.

No IHS meant that to this day, 20+ years later, I've held onto my first overclocked CPU. It still worked when I pulled it, but one of the CPU spacer pads was half fucking melted.

Flashbacks to trying to fit Socket A/370 coolers with the spring clip

But yeah basically this whole writeup is "yeah things are better now but they're less fun". Somewhere, there's a weird excitement about risking your motherboard trying to attach a cooler.

Well, don't worry, if the new AMD socket screws are anything to fucking go by that's never left D:

christ that tension made me think I was going to shatter the board but it's apparently normal

It was on an AM4 board, stock cooler for the 5600X. Just felt a little spicy, but then again, the last cooler I installed before that was probably a Q3350's or something

YEAH I managed to pull that out of the brain folds, I wasn't into the AMD gear at the time but I remember hearing about that one a lot, and it sounds like it was a banger. Very similar to the E4300, just a massively underclocked chip being sold for cheap.

The Conroes that predated the 4300 also just... ruled. I had one of the E2160 "Pentiums" of the Allendale wave, but also a 6320 that replaced my Newcastle AMD64 3500+. Good times. I've still got a CPU-Z screenshot of how I had it running, ha.

🤓 the 4300 and 2160 actually launched around the same time, and the 6320 was a bit later. If memory serves the early early Conroe chips didn't clock super well, although they were good performers. The later chips like the E4300, between new steppings and less cache, did a lot better for overclocks. The 2160 looks like it was basically just a 4300 with 1MB of L2 instead of 2. Bet it clocked good.

Ahhh, must still be mixing it up with something. I had a full and partial cache version, but the full ended up dying when someone knocked a bottle of mountain dew into my case.

If you'll excuse me, I'm going to go die of cringe from that being the most "gamer trans woman in 2006" thing I've said in years