cathoderaydude
@cathoderaydude

The above picture is a screenshot from a videogame running on EGA, the graphics standard that made "proper games" possible on the IBM PC. It was a monumental addition to the PC's capabilities, and if you've played any significant subset of 80s and early 90s PC games, you've seen it plenty. So, you might notice that the image above looks... a bit off. This is because, as much as EGA enhanced the PC platform, there was an entire dimension to its feature set that almost nobody ever saw, because of a... highly questionable decision on IBM's part.


cathoderaydude
@cathoderaydude

I got a comment on this post earlier, which I wrote eons ago. That led to me rereading it... and discovering a number of severe factual errors, as well as the fact that I uh. I. I just never made my point?? Originally?? Like?? The original post had no point! I failed to actually make the conclusion that the entire post was written to deliver. So I rewrote it completely and I assure you, it's a much better read now.



in reply to @cathoderaydude's post:

I love this write-up so much! I became kind of obsessed with EGA and digging up its history a few years back. It sent me down the path of making a game engine that emulates the EGA color limitations down to storing the framebuffer in bitplanes. I don't think I ever really understood that the RrGgBb wasn't even available in the non-350-line modes! That clears up a lot of confusion I had :host-nervous:

The irony is, though, that unlike EGA, the MCGA/13h mode got a TON of use over the years, well into the late 90s. Fricking DOOM ran in MCGA mode, and apparently it still didn't even use the VGA page swapping. It was kinda the standard base graphics mode for a lot of early 3D, and I guess it's much as you pointed out with the hires EGA: 320x200 is just a lot fewer pixels to push at once when all you've got is a CPU to do it.
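
For anyone who hasn't poked at it, this is basically the whole programming model that made it so attractive - a rough sketch, assuming a 16-bit DOS compiler along the lines of Turbo C or Open Watcom (int86() is in dos.h; MK_FP is in dos.h on Turbo C, i86.h on Watcom):

    #include <dos.h>    /* int86(), union REGS, MK_FP */
    #include <conio.h>  /* getch() */

    static void set_video_mode(unsigned char mode)
    {
        union REGS r;
        r.h.ah = 0x00;       /* INT 10h, AH=00h: set video mode */
        r.h.al = mode;       /* 0x13 = 320x200 in 256 colors    */
        int86(0x10, &r, &r);
    }

    int main(void)
    {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
        int x, y;

        set_video_mode(0x13);

        /* One byte per pixel, row-major, no planes or masks:
           offset = y * 320 + x, and the byte is just an index
           into the 256-color palette. */
        for (y = 0; y < 200; ++y)
            for (x = 0; x < 320; ++x)
                vram[y * 320 + x] = (unsigned char)(x ^ y);

        getch();               /* wait for a key... */
        set_video_mode(0x03);  /* ...then back to text mode */
        return 0;
    }

No promises that's exactly how any given game did it, but that flat framebuffer is the whole appeal.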

Weren't MCGA and VGA pretty much developed concurrently? If so, I feel like it's more that this was the "they can have little a VGA, as a treat" mode rather than vice versa - but I admit I haven't looked into this so maybe I do have things backwards!

Looks like you might be right, yeah. But notably, VGA was limited to the 286 and higher models of the PS/2, which were eye-wateringly expensive. A sidebar in an article in Byte about the initial line points out that the Mac II worked out to be cheaper than a competitively specced PS/2. So I wonder if another reason MCGA mode was so popular is that the base model 25/30 were the only ones most users could afford.

We'd have to go trawl through the fossil record to confirm, but here's a total asspull: neither really got a lick of use by 95% of users... until the VGA got cloned in cut-rate $300 cards a couple years later and became the universal default for clones, and in turn the lowest common denominator target for game developers - and at that point this particular mode would be the most attractive for perf reasons. Could be total BS - to confirm it, you'd have to cross-reference the number of "MCGA"-mode games made in 88-89 against magazine articles saying "VGA Is Finally Affordable!"

That's a good point, and one I don't really know enough about to say one way or another. My vague sense of the history is that the PS/2 line didn't actually do that great, and I recall even being warned off getting a Model 25 at one point because the performance was somehow singularly ass even for an 8086 machine of the era. So it's very possible that it in itself didn't have the influence, and 13h just kinda took off on its own merits: it was easy enough to backport without using the paging, so lots of games did it as an afterthought.

I often link people to this post when talking about EGA, a normal thing to do.

Something that has always confused me a little bit is the assertion that the 64-color rgbRGB EGA colors were only available on the 350-line graphical mode and that the 200-line modes required the 16 RGBI palette. It's confounded me some because that doesn't seem to be true? Everywhere I've looked, it seems that you could always set the 16 palette registers to any 6-bit values.

I wondered today, though, whether the reason has to do with CGA compatibility? If you changed the palette in 200-line mode it would work, but only on an EGA monitor, so the colors wouldn't work when sent to a CGA monitor? In that sense perhaps it made sense to never modify the palette in 200-line mode, because your game just wouldn't look right on non-EGA monitors.

Would love to know if there's anything I'm missing there!
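
For concreteness, this is roughly what I mean by "set the palette registers to anything" - just a minimal sketch, and I'm assuming a 16-bit DOS compiler along the lines of Turbo C or Open Watcom, where int86() and union REGS live in dos.h:

    #include <dos.h>   /* int86(), union REGS - Turbo C / Open Watcom style */

    /* Set one EGA palette register through the BIOS.
       INT 10h, AH=10h, AL=00h: BL = palette register (0-15), BH = 6-bit value. */
    void set_palette_register(unsigned char reg, unsigned char value)
    {
        union REGS r;
        r.h.ah = 0x10;          /* "set palette registers" family           */
        r.h.al = 0x00;          /* subfunction: set an individual register  */
        r.h.bl = reg;           /* which of the 16 palette registers        */
        r.h.bh = value & 0x3F;  /* 6-bit color: bits 0-2 = B,G,R; 3-5 = b,g,r (low intensity) */
        int86(0x10, &r, &r);
    }

    int main(void)
    {
        /* Remap attribute 1 (normally dark blue) to 0x14: red at two-thirds
           plus green at one-third, i.e. EGA brown - a value that plain
           4-bit RGBI has no way to express. */
        set_palette_register(1, 0x14);
        return 0;
    }

At least on paper that call doesn't seem to care what mode you're in, which is exactly why the 200-line assertion confuses me.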

Generally speaking, with the hardware of this era, the idea that you can't do something doesn't exist; you can program the registers any way you want, because the card doesn't actually have a processor on it, it just has logic gates. This is why it was possible to blow up a monitor by setting the wrong mode: your card would happily send incompatible timing, even in the early VGA era.

When someone says that you can't use those colors in 200 line mode, I think the point is not that the hardware won't permit it, but that you won't get useful results. The extended signaling lines simply don't exist on 200 line displays, so if you try to display the extended palette, you're going to get unexpected and probably illegible results; and if I recall correctly, the enhanced displays disable the extended colors in 200 line mode, for compatibility.

But with that said, would I be surprised to learn that you can use the extended colors in 200 line mode on an enhanced monitor? Absolutely not, lmao.

Aaaaah, that makes perfect sense! Because yeah, I was even thinking of your proposed example of having an in-game or on-install user configuration for what type of monitor you're connected to (from what I can tell, trying to detect it is a fool's errand), and then shipping assets that work in the RGBI palette but can optionally do some fun palette switching in-game in the EGA-monitor mode... I would be super interested to try it and see if that was ever done

I was always like 95% understanding everything but it always bugged me that I didn't completely get that last bit. Thank you for indulging me!

And right you are about the Enhanced Color Display! From the hardware reference:

The IBM Enhanced Color Display is an advanced color display
capable of operating in two separate modes. Mode 1 is a 16 color
640 by 200 overscan mode with a horizontal scan frequency of
15.75 kHz. Mode 2 is a 64 color 640 by 350 mode with a
horizontal scan frequency of 21.8 kHz. Both modes are
non-interlaced. The monitor determines which mode to operate in
by decoding the vertical sync polarity.

200 line mode maps the 4 input bits directly to the RGBI palette.

ah, terrific, I remembered correctly, haha. rare for me

this behavior always seemed odd to me, but I think the reason is that, if you interpreted the primary RGB lines as they're used in enhanced color mode, then graphics designed for 16 color mode would come out dim and washed out. The extended color bits don't modify the original full intensity colors, they redefine the pins entirely, so you'd get totally bizarre results.
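
to put a number on it, here's a tiny sketch of how I believe the 6-bit value decodes (assuming I have the layout right: bits 0-2 are the two-thirds-intensity BGR pins, bits 3-5 the one-third-intensity bgr ones):

    #include <stdio.h>

    /* Expand a 6-bit EGA palette value to 8-bit-per-channel RGB.
       Primary bits (0-2) are worth ~2/3 of full scale, secondary bits (3-5) ~1/3. */
    static void ega6_to_rgb(unsigned int v, unsigned int *r, unsigned int *g, unsigned int *b)
    {
        *b = ((v >> 0) & 1) * 0xAA + ((v >> 3) & 1) * 0x55;
        *g = ((v >> 1) & 1) * 0xAA + ((v >> 4) & 1) * 0x55;
        *r = ((v >> 2) & 1) * 0xAA + ((v >> 5) & 1) * 0x55;
    }

    int main(void)
    {
        unsigned int r, g, b;
        /* RGBI "bright white" is 0x0F; read as rgbRGB, the old intensity bit
           lands on secondary blue, so it decodes to AA/AA/FF - a washed-out
           pale blue instead of white. */
        ega6_to_rgb(0x0F, &r, &g, &b);
        printf("0x0F as rgbRGB -> #%02X%02X%02X\n", r, g, b);
        return 0;
    }

every full-intensity RGBI color would take that kind of hit, which is the washed-out look I mean.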

The irritation, of course, is that they didn't just provide a switch to override this - but they would have worried about users setting it wrong, getting mad when software didn't look right, and refusing to accept that they were at fault.

Something I was wondering in the context of graphics... what WERE people using PCs for in the 80s? It's usually implied that in the UK they ended up being used mostly for games and people learning how to make them... I think. It's all well before my time but I wonder in the US if people were primarily buying and using computers, IBM PC based and otherwise, for... idk, spreadsheets and word processing. I'm not sure what the graphics would be for otherwise? Other than games, there's nowhere near enough color quality to do graphics stuff... right? I think? I really have no idea. Hell, did people just use 8088s until Windows came around???

Using a computer for games is so much different than anything else. Games don't really require shuttling data in and out, just user input and graphics + sound output. Pretty much anything else you have more data than you'd want to input by hand and, sometimes, more output than you'd want just on the screen too.

But honestly 80s computing is so foreign to me that I might be wrong about all of that!

in reply to @cathoderaydude's post:

The above picture is a screenshot from a videogame running in EGA - that is, the graphics card that made "proper games" possible on the IBM PC. It was a monumental addition to the PC's capabilities, but if you know old PC games at all, you'd probably not have guessed that this was EGA. You'd think it was VGA, because even if you've played hundreds of EGA games, you probably never saw one that looked like this.

I may have boiled my brain by reading about EGA this week for the Commander Keen video, but your opening completely lost me. Like, I don't know if I've crossed over to the other side of 'knowing too much,' but this looks like every other mid-effort EGA game I can bring to mind.

it's entirely possible i boiled my own brain in different ways, but when I fired this game up randomly in dosbox, i yelped when i saw the palette, since my neurons are programmed to know that none of those colors are possible in "normal EGA." i can easily believe that i am the only person who would have reacted this way

I'm glad for the rewrite mainly because I didn't follow you yet back when you posted the original, so now I get to read it and enjoy it, hearing your voice as I did, as if you were presenting it in video form. The B-roll was provided by memories of the 486 we inherited from Dad's employer in 1995, an upgrade over the green-screen IBM PC we had and used for a decade before that.

I'm considering writing a small game for DOS, and while doing research I found it surprisingly hard to find concrete technical details on this mode. I think 16 colors ought to be enough for anyone, provided you can pick which 16, so I'm considering using it.

The book I have, The Programmer’s Problem Solver, 2nd Edition by Robert Jourdain, has examples for setting the EGA palette but claims that you only have sixteen possible values to stick in the sixteen palette registers. All the examples use four-bit values, which feels like it is of limited utility — it talks a bit about the effects you can achieve with palette tricks, but it’s certainly not using the full six-bit range.

with 350-line modes being moribund almost immediately and universally, I wonder if they're taking the approach most people did and defining "EGA" implicitly as the 200-line mode and the CGA-compatible palette. The card probably allowed you to remap the palette even in that mode, since it would still enable color cycling tricks, but you'd be limited to RGBI's 16 colors unless you explicitly set the expanded color register (I'm guessing)
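
if you do want the full six-bit range, the whole-palette BIOS call makes the color cycling tricks pretty painless too - sketch below, with the usual caveat that I'm assuming a 16-bit DOS compiler (int86x(), segread(), FP_SEG/FP_OFF are the Turbo C / Open Watcom dos.h flavor) and that I've got the retrace bit right (bit 3 of the input status register at port 3DAh):

    #include <dos.h>  /* int86x(), segread(), union REGS, struct SREGS, FP_SEG, FP_OFF, inportb() */

    /* Load all 16 palette registers plus the overscan (border) color at once.
       INT 10h, AH=10h, AL=02h: ES:DX points at a 17-byte table of 6-bit values. */
    static void set_all_palette_registers(unsigned char far *table)
    {
        union REGS r;
        struct SREGS s;
        segread(&s);                 /* start from our current segment registers */
        s.es   = FP_SEG(table);
        r.x.dx = FP_OFF(table);
        r.h.ah = 0x10;
        r.h.al = 0x02;
        int86x(0x10, &r, &r, &s);
    }

    /* Wait for the start of vertical retrace so the palette swap doesn't tear.
       (inportb() is the Turbo C spelling; Watcom/MSC call it inp(), in conio.h.) */
    static void wait_for_vretrace(void)
    {
        while (inportb(0x3DA) & 0x08)     /* let any retrace in progress finish  */
            ;
        while (!(inportb(0x3DA) & 0x08))  /* then catch the start of the next one */
            ;
    }

    int main(void)
    {
        /* 16 palette entries + 1 overscan entry; start from the RGBI-alike defaults. */
        static unsigned char pal[17] = {
            0x00,0x01,0x02,0x03,0x04,0x05,0x14,0x07,
            0x38,0x39,0x3A,0x3B,0x3C,0x3D,0x3E,0x3F,
            0x00 };
        int frame, i;

        for (frame = 0; frame < 600; ++frame) {   /* cycle for a few hundred frames */
            unsigned char first = pal[1];
            for (i = 1; i < 15; ++i)              /* rotate registers 1-15 */
                pal[i] = pal[i + 1];
            pal[15] = first;
            wait_for_vretrace();
            set_all_palette_registers(pal);
        }
        return 0;
    }

zero promises that any shipped game did it this way, but mechanically that's all the "expanded color register" stuff amounts to.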

Honestly, given how ham-handed the rollout of EGA was, the fact that it had as much of an impact as it did was astonishing. But I guess it's a testament to how basically good the initial premise of the PC was that it managed to win in spite of almost every early decision IBM and Intel made...

That's just it, right? One thing I think is undeniable is that if IBM hadn't produced the PC, someone would have. Maybe it would have taken another year or two, but the fundamental idea was solid and straightforward. The PC was not that innovative, nor that high-end - it wasn't a minicomputer, like the "real computers" of the era. It was more like, "what if a home computer... wasn't built like shit? what if it wasn't built down to an austere price point?" It was a PET with more RAM, a good keyboard, a metal case, a less-outdated processor - but still, fundamentally, a consumer-oriented 1970s design.

That initial raft of late 70s computers established very firmly that there was a market for computers in the home, but the technology was so expensive that nobody was willing to take the plunge of developing a really high quality one (and software and peripherals) with a commensurate price tag until they were absolutely certain that it was going to sell. IBM had the money and clout to make that bet before anyone else was ready, but it was going to happen.

IBM really wanted to fuck it up, they really wanted to ruin their bid for the market, but the existence of the PC at all was its primary value, and the open expansion standard made it possible for third parties to patch IBM's failures. It was slated to win by default, and IBM simply failed to ruin it despite their intent.

Yeah!! Exactly that! Like, the one actual interesting thought that went into it besides "home computer but not suck"? - and as you say, someone would have done this! Maybe Osborne! Maybe Kaypro! - was the idea that you could build a Business Machine to compete with the CP/M boxes but:

  • Using a processor that could sort of address a full meg of memory.
  • With some minimal concessions to the home computer market (joystick port, something resembling bitmap graphics), so that they could sell people the same computer for home that they used at work.

IBM struck gold and then tried their damnedest to make it fail by being IBM about it! It took fucking Compaq to drag it over the finish line by actually releasing a 386 machine when IBM was still trying to figure out how to sell 286 machines and kill ISA (and thereby murder their own cash cow!)

As a kid in the late ‘80s who owned an Atari 600XL (which I still have!), it seemed clear to me that PCs were the future. All the magazines were focusing on them, that sort of thing. But they were so pathetic on the games front. When I visited friends with PCs it was a mixture of jealousy and pity.