cathoderaydude
@cathoderaydude

i'm probably gonna go into a lot more detail later since i have a lot of thoughts to sort out before i make a video about this, but: these are EGA monitors. true, actual EGA, not CGA, and that makes them very tough to find; I had never seen one in my life, even a photo, until a couple days ago. as a result I am learning... New Information I Did Not Have.


cathoderaydude
@cathoderaydude

A while back I wrote a post about why EGA's extra features never mattered, which was kind of a wet fart because in the end the actual answer was "the new features sucked." Nonetheless, I had something stuck in my craw about it which I've been trying to run down. i will summarize and then explain what is going on with these very cool monitors


cliff notes

IBM put out the CGA card in 1981. It could do 320x200 or 640x200 graphics, as well as 40 or 80 column text. These modes were all in color, and drew from a 16-color palette:


Unfortunately you were not able to use these colors freely. The 320x200 mode could only use one of two fixed, 4-color palettes which were both absolutely putrid. The 640x200 mode was limited to two colors: black, plus a foreground color you could pick from the full palette. Text mode, on the other hand, could use all 16 colors freely. These were, to be fair, absolutely amazing for a home computer in 1981; nothing else came close. It still sucked though.

In 1984 IBM released the EGA card, which could do a lot better. It could do the same 640x200 and 320x200, as well as a new 640x350 mode, all in sixteen colors, which were in turn drawn from a palette of 64 colors.

These new capabilities went almost totally ignored. The 200-line modes were used heavily, but only using the original 16-color CGA palette. The 640x350 mode was almost never used. Neither of these are actually tragedies; 640x350 was so many pixels that most contemporary PCs would have struggled to update the screen at any decent rate, and the new palette was terrible.

grid of the 64-color EGA palette

It looks rad at first, but if you look closely, it's awful. There are almost no useful skin tones, no new greys, few pastels, and a bunch of useless saturated greens and purples. It sucks.

Still, there were new colors in it, and some of them were desirable, and the new high resolution mode absolutely had applications. Both went almost totally ignored however, because IBM made a total mess of the monitor interface.

IBM introduced a new 5154 Enhanced Color Display for the EGA. This was necessary both because of the new color capabilities, and because monitors in this era were very simplistic. Instead of detecting the incoming signal format and adapting their scan rate and geometry, almost all monitors were hardwired for a specific frequency and worked with no others. In the late 80s, "multisync" monitors would be introduced that could sync to any signal within a range of frequencies, but in 1984, the only practical option was a "dual-sync" monitor, one that supported exactly two predetermined rates: in this case, the old 200-line (15KHz horizontal sweep) and the new 350-line (~22KHz horizontal sweep).

IBM wanted the new monitor to work with the old CGA card, and for the original 5153 Color Display to work with the new EGA card, but they didn't want to include any proper method for the card and monitor to identify themselves, nor any way to signal which color mode was desired. This was a real problem, because color depth was complicated in this era.

Modern monitors are all based on the VGA pattern, which is incredibly simple, basically as simple as a display can get. Red, green and blue color components are sent as simple analog values, along with horizontal and vertical sync pulses to allow the monitor to identify the size and frequency of the signal. Because the values are analog, they have infinite precision, so even though the original 1987 IBM VGA card only supported 256 colors at most, the monitor sold with it can be connected to a 2024 device which outputs 16 million colors. The problem is that this requires the video card to have a RAMDAC, which is very expensive and possibly hadn't even been invented in 1984.
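
To make the RAMDAC's job concrete, here's a minimal sketch in Python. The palette contents are made up purely for illustration; the 6-bits-per-channel and ~0.7V full-scale figures are the usual ones quoted for the original VGA.

```python
# Minimal sketch of what a RAMDAC does: a pixel arrives as a palette index,
# gets looked up in on-chip RAM, and each channel is converted to an analog
# voltage. The palette below is hypothetical, just to have something to look up.

VGA_MAX_LEVEL = 63        # 6 bits per channel on the original VGA DAC
FULL_SCALE_VOLTS = 0.7    # nominal analog level for "full brightness"

# Hypothetical 256-entry palette: (r, g, b), each 0..63
palette = [(i % 64, (i * 2) % 64, (i * 3) % 64) for i in range(256)]

def ramdac(pixel_index: int) -> tuple:
    """Return the analog R, G, B voltages the monitor would see for one pixel."""
    return tuple(round(c / VGA_MAX_LEVEL * FULL_SCALE_VOLTS, 3)
                 for c in palette[pixel_index])

print(ramdac(200))  # three small voltages; the monitor never sees "a number"
```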

Instead, CGA and EGA used "TTL video", where the image is sent to the monitor as digital, on-off values. TTL refers to the voltage, roughly 5 volts for on, roughly 0 for off. CGA used 4-bit TTL, also known as RGBI, which has bits for red, green and blue, plus an "intensity" bit which brightens the whole pixel. Add up all the combinations and you get the 16-color palette above. EGA switched to a 6-bit format, which IBM called "RrGgBb", and which basically gave each color channel its own intensity bit, expanding the palette to 64 colors.
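
To pin down what those bit formats mean in practice, here's a rough Python sketch using the conventional one-third (0x55) and two-thirds (0xAA) level steps; the exact output levels are down to each monitor's analog circuitry, so treat these as nominal.

```python
# Rough sketch of the two TTL color formats. Levels are the conventional
# 0x55/0xAA steps, not measured values.

def rgbi_to_rgb(r, g, b, i):
    """CGA-style 4-bit RGBI: each primary contributes 2/3, intensity adds 1/3 to all."""
    return tuple(0xAA * c + 0x55 * i for c in (r, g, b))

def rrggbb_to_rgb(R, r, G, g, B, b):
    """EGA-style 6-bit RrGgBb: each channel has a 2/3 "primary" bit and a 1/3 "secondary" bit."""
    return (0xAA * R + 0x55 * r, 0xAA * G + 0x55 * g, 0xAA * B + 0x55 * b)

print(rgbi_to_rgb(1, 1, 1, 0))          # (170, 170, 170): light grey
print(rgbi_to_rgb(1, 1, 1, 1))          # (255, 255, 255): white
print(rrggbb_to_rgb(1, 1, 0, 1, 0, 0))  # (255, 85, 0): a color 4-bit RGBI can't produce
```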

IBM needed the 5154 to know whether the card was sending 4-bit or 6-bit color, and which resolution was in use. They could have solved this many ways; they chose not to.

Instead, they crushed both questions together into a single bit: resolution and color depth are determined by the polarity of the sync signal. If the card sends positive sync pulses, the monitor switches to 200-line mode and 4-bit color. If they're negative, the monitor expects 350 lines and 6-bit color. This means the other combinations are impossible.
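
Boiled down to a lookup table (a caricature, obviously; the real monitor does this with circuitry, not code), the 5154's entire "negotiation" is something like this:

```python
# Sync polarity alone decides BOTH the scan rate and how the color bits are
# interpreted, so the mixed combinations (e.g. 200 lines with 6-bit color)
# can't even be requested.

MODE_BY_SYNC_POLARITY = {
    "positive": {"lines": 200, "ttl_bits": 4},  # CGA-compatible: 15KHz, RGBI
    "negative": {"lines": 350, "ttl_bits": 6},  # enhanced: ~22KHz, RrGgBb
}

def monitor_mode(sync_polarity: str) -> dict:
    return MODE_BY_SYNC_POLARITY[sync_polarity]

print(monitor_mode("positive"))  # {'lines': 200, 'ttl_bits': 4} -- and that's all you get
```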

Of course, they aren't. The card (by many reports, none very substantiated) will happily output 200-line video with 6-bit color. It's the monitor that won't accept this, since there's no way to tell it what you're doing.

what i'm doing

IBM didn't provide a method to tell the monitor you're using a mixed mode, but that doesn't mean you can't just do it anyway if the monitor allows it. For years I've heard that there were monitors that did this: they used the sync pulse to determine the resolution, and by default would follow IBM's rules for interpreting the color, but then provided switches that let you override the bit depth.

I've been wanting to play with this for eons. I mean obviously I want to do a video about it - it's not totally uncovered, but it hasn't been covered much, and most people have misconceptions about it. And, of course, I'd like that video to include some EGA Trutherism, where I talk about how the expanded palette totally could have been used.

The trouble is, this absolutely requires Real Steel. Emulation is 100% untrustworthy, so I need a real EGA card and monitor. The card is rare, but obtainable (I have a clone which works fine; I need an original IBM card to prove my points though). On the other hand, I've never, ever seen an EGA monitor. I've searched on and off for years and come up with absolutely nothing; "switchable EGA display" etc. are anti-google phrases; they produce nothing but useless sludge, half-right forum posts, and the same posts from stackexchange and os2museum and whatnot. No model numbers, nothing that can be looked up on ebay. Mind you, I've never even seen an EGA monitor, period.

I have always gotten the strong impression that nobody bought enhanced EGA monitors, because there was no point. The new 16-color capabilities of EGA operating in the old 200-line modes enabled much better looking software on the monitor you already had, so people just bought the card and used their old monitors. I'm no longer so sure this is true, but in any case, almost every single monitor I've ever found on a "to be shredded" pallet at the ewaste store has been VGA, CGA or monochrome. I have never once seen one that was provably EGA.

Of course, it's normally impossible to tell, because a basic EGA monitor looks EXACTLY like a CGA display until you look up the model number (most of which produce zero results) or plug it into an EGA machine and run EGA software, which is generally very hard to find. Again, this is an ungooglable topic. Everyone thinks that "EGA software" means "320x200 with 16 colors," and I had to spend several days on it to even put together a short list of titles.

So after years of finding nothing, just nothing at all, a friend hits me up the other day and says "hey man check this out," and it's a link to an ebay post. He's been really into the BBC Micro (and relatives) lately, and the monitor that was usually sold with those was called the Cub, and was made by a company called Microvitec (Micro-vee-tech). They are apparently legendarily durable and high quality, so he's been getting really deep into researching them to find out what variations might exist.

In this case, he'd gone to ebay and simply looked up the company name on a whim. It produced almost no hits, but one of them was a listing for a Definition 12L71DNS3. We had never heard of this, but the manual popped right up, and boy howdy, is this a special little unit. I'm sorry to report that me and my friends acted like awful little vultures and bought them all already, and none are likely to show up again soon, if ever, for reasons I'll explain.

The fortunate thing is that this largely does not matter: you probably don't need one, and it's not a good replacement for e.g. a PVM for most people. But I wanted to talk about it, because this category of object is absolutely and completely forgotten, let alone a specimen this dank.

the Definition

The Definition is the black-framed monitor in the pics, and it's hard to even know where to start with it. This thing does so much.

So first, it's multisync. This is not that remarkable for the 90s and 2000s, but given that it was made in 1990, it's pretty wild. There were many other multisync displays at the time, such as the titular NEC Multisync (no bloody A, B, C or D, just Multisync) but almost all of them have been crushed and thrown into landfills.

The thing about multisync displays is that almost all of them were made after 1990. At that point, clones of IBM's VGA card were beginning to overwhelm the market, and since its lowest scan rate is 31.5KHz... that's as low as any of these will go. Many will go to very high rates, and that's great, but nothing ever goes below 31.5, because by like 1993 there wasn't a single computer being sold that didn't run at 640x480 or higher from the moment it was turned on.

The Definition, however, multisyncs from 36KHz down to 15. This is an incredibly useful range; 36KHz is well over 640x480 VGA, and 15KHz is AKA 480i/240p. So this has a lot of implications if you waste your time on dumb old 8-bit bullshit.

It also does both analog and TTL. This is not unheard of; the extremely common Commodore 1084 used with Amigas does this. However, that display is limited to 4-bit TTL; this is not. In fact, it supports more TTL formats than I knew existed.

It turns out that in addition to RGBI (4-bit) and RrGgBb (6-bit), there is also RGB (3-bit), which is simply the primary colors and their combinations, at maximum intensity. This is used by the BBC Micro, and maybe a couple other things, so naturally the Definition supports it, in addition to the other two formats. There are switches on the back allowing you to override various things about the color and signal format.

It also supports every sync format: H/V, composite sync, and sync-on-green.

When you add all this up, you get a device that can talk to anything made before a certain date. Here's a short list:

  • Anything that does 240p RGB (many home computers, modded game consoles, countless other things)
  • 3-bit TTL video (BBC Micro)
  • 4-bit TTL video (PC CGA, 200-line PC EGA, Amiga, Commodore 128)
  • 6-bit TTL video (PC EGA, 350-line)
  • TTL or analog video at 24KHz (PC8801/PC9801)
  • Interlaced video up to at least 576i (BBC Micro)
  • All of the above at either 50 or 60Hz (NTSC and PAL devices)
  • Analog video up to 800x600@56Hz (JUST shy of 60, but you wouldn't want to run 800x600 at this dot pitch anyway)

I know I've left some things out. It's an absurd list of capabilities; I know of nothing else that can do all this.

So why is this so flexible? Well, partly because Microvitec was just like this, but also because this very specific monitor was a custom design for a very specific purpose. We had to do some deep diving to figure this out.

The 12L71DNS3 was commissioned from Microvitec, probably by a company called Micrognosis in Danbury, Connecticut, on behalf of JP Morgan, to be installed at 60 Wall Street. Apparently these Micrognosis folks had a business where they would go to a business that was involved in the stock market and sell them an entire "trading floor" as a package deal, including (of course) a ton of monitors. That's right, these were custom-ordered stonks displays.

Based on a facebook post we dug up, these were probably part of an "800-seat" deployment. I have no idea what that looked like, but let's assume every "seat" got two monitors, and that they had spares for a couple years, plus some overhead displays, yada yada... they probably made less than 10,000 of these, and possibly no more than a few thousand.

Because they were intended for such a critical application, they are not just capable, they're also robust. The chassis is entirely steel, including the bezel. This is a 12 inch display, a pretty tiny guy; it weighs 40 pounds. It's built like a fucking tank, it's tremendously serviceable, and it even supports balanced fucking signaling. The manual says that you can feed it differential analog signals to prevent interference. I've never even heard of that. So this is an absolute god damn monster of a device.

So I order the Definition almost immediately, because it's the first EGA-compatible display I've seen in my entire life. It's $100 plus $100 shipping; this feels steep, but as noted earlier, the normal cost of a device like this is over $450 before shipping. I am making out like a thief no matter what happens here, even if it arrives smashed. I mean, given what it was intended for, it could have stonks burned so hard into the screen that it's useless. The seller obviously has no idea what it is, and there are no test images, but I am willing to find a replacement tube, that is how hype I am for this thing.

it took a week to get here and I fully expected it to arrive totally smashed, but instead it's utterly pristine. Not a scratch on it, barely any dirt, zero burn-in. It must have been a spare that never got put into service.

the AT&T (NEC)

The next day I walk into RePC, and sitting on a cart with a big X across its face is a monitor that just SCREAMS "i'm something weird" to me. I turn it around, and holy jumped up christ it has a 9-pin plug and switches on the back. It's definitely EGA. The marks on the screen are crude sketches of smoking toast; the screen has burn-in, plainly visible even with it off, so it's destined for the trash because people will not buy monitors with burn-in. I am not people however

I plug it in, and of course it works. I plug it into an EGA card and of course it has a clear and bright picture. It was literal garbage however, so I take it home for... $20.

I spend five years looking for one of these displays, finally spend $200 on one, and then find another for an order of magnitude less. Well Ain't That The Breaks

So here's the thing: It's an AT&T CRT 329M. That doesn't exist; you can search anywhere you want online, look in magazines and manuals and old inventory lists, it does not exist. AT&T had other monitors with similar names, probably sold with various PC 6300s (rebadged Olivettis), but not a 329M.

I thought to look up the FCC ID; it turns out to be an NEC. So I've just pulled an incredibly rare Multisync out of the trash, and this one isn't beat to shit or deep canary yellow. This is a shocking reversal of fortunes.

This monitor is not nearly as well built as the Definition, it's just a typical plastic case, but it is a couple inches larger. It has burn-in as mentioned, bad enough that you can see it in the phosphor when off, but you really don't notice it unless you're staring at a blank field of solid color, so that's cool. It also can't do analog input, only TTL, and I don't know what its sync range is. If it's actually just a rebadged Multisync then it could be quite wide, maybe better than the Definition, but I don't really know.

It does have the override switch for selecting TTL color modes, but it has something interesting going on. While the Definition has settings for 3-bit, 4-bit and 6-bit, the NEC instead has "8", "16(NEC)", "16(IBM)" and "64". Obviously the first, third and fourth represent the same modes as the Definition - but what's 16(NEC)?

Well, the most likely explanation is that it's short for NEC [PC88/PC98], because both of those systems also used TTL video - however, IBM's implementation of RGBI was slightly different from everyone else's. Taken straight, RGBI produces a pleasing array of primary colors and mixtures... except for "dark yellow", which ends up being a pretty unsatisfying color. So when IBM displays (and compatibles) detect that specific combination, they activate a logic gate which knocks the green component down, producing a brown instead of a yellow.

NEC's computers (and possibly others) did not do this trick; their dark yellow was just dark yellow. So this switch position cuts out the logic circuit and makes it behave as normal. I confirmed this from the manual for an NEC Multisync II, which has the exact same switch labels - so it seems pretty certain that this is just a rebadged Multisync.
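
For what that brown circuit amounts to, here's a sketch; the levels and the exact amount of green reduction are nominal, and the function names are mine, not anything official. The real monitors do this in analog.

```python
# Sketch of the "brown fix": the one RGBI code for dark yellow
# (R=1, G=1, B=0, I=0) gets its green knocked down so it reads as brown.

def rgbi_ibm(r, g, b, i):
    red, green, blue = (0xAA * c + 0x55 * i for c in (r, g, b))
    if (r, g, b, i) == (1, 1, 0, 0):  # dark yellow -> brown
        green //= 2                   # 0xAA -> 0x55
    return (red, green, blue)

def rgbi_nec(r, g, b, i):
    # the "16(NEC)" switch position: no special case, dark yellow stays dark yellow
    return tuple(0xAA * c + 0x55 * i for c in (r, g, b))

print(rgbi_ibm(1, 1, 0, 0))  # (170, 85, 0): brown
print(rgbi_nec(1, 1, 0, 0))  # (170, 170, 0): dark yellow
```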

Now, my friend who knows more about NEC PCs than me says that this seems odd, because the PC88/98 pretty much switched over to analog video years before this came out (it's marked 1988), but I wouldn't be surprised if NEC just decided to include it for backwards compatibility, Just In Case. It's odd that the Definition doesn't have this option though; I wonder if there's a DIP switch setting for it.

One more thing that the Definition lacks, though it's not one I necessarily need, is the MONO mode. This crushes all input to 1-bit, coercing the entire image into stark white, amber, or green. Cool in 1990; not so useful now, though.

so what's going on in the pics

okay so the first two pics in the first post are nothing special, but the other two are wacky hijinks.

I went to the studio today to try out the monitors. I had an ancient 8-bit ISA EGA card (a Twinhead EGA CT-8090) but I needed a PC, and (while this may shock you) I don't actually have a decently contemporary system handy. I mostly have Pentium IIIs and later; my older machines are weird stuff like the Eduquest. So I pull out a Pentium 133 Dell Optiplex and stick the card in that. It has onboard VGA and it's from like 1996, so I figure I'm going to have trouble unless I disable the onboard graphics. To my shock, it detects the EGA as addon video and selects it on boot!!

...and then proceeds to boot into Windows 98?? With EGA video???????

I'm really not sure what happened here! People say that Windows 9x will run in EGA mode if you copy over the EGA.drv from Windows 3.x and replace the VGA.drv in windows\system, but I hadn't done that, and more to the point, that's not what was happening. The bottom of the desktop was cut off and the screen was stretched vertically. I'm convinced that Windows THOUGHT it was running at 640x480, and the EGA card was just displaying a 640x350 slice. How? I have no idea. My Twinhead card came out in 1987; there is no possibility whatsoever that it had some kind of VGA support. Is the Dell BIOS somehow providing VGA emulation?? Some dark, forgotten PhoenixBIOS bullet point? I am completely baffled.

Whatever trickery is responsible for this is obviously not doing a very good job. I think the Windows interface is supposed to be double buffered, because when you do anything that updates the screen, it gets horribly corrupted every other frame. The corruption looks like the framebuffer is being read with a slight offset, you can still sort of make out what you're looking at, so I strongly suspect that when Windows tries to read from the second video page, it's being wrapped around in VRAM by the circuitry and hitting the first page again, but off by a byte or something.

In addition, nothing redraws properly. Windows leave trails, etc. and when the Flying Windows screensaver starts, the screen remains unchanged except where the windows are drawn, and they leave black trails. It's incredibly trippy.

I doubt I'll ever figure out what's causing this, and eventually I will go in and test this with the right EGA.drv and all will be well. It is intriguing of course to remember that this ought to work; the resolution will suck, sure, but the colors will actually be fine, because the Windows "VGA" palette is identical to CGA's 16-color RGBI palette. IBM made RGBI the default on the VGA, so that's what everyone defaulted to, so it's no surprise that where the screen isn't corrupted, this looks exactly as it should.

The images attached to this post illustrate two other things. First, here's a picture of EGA working as intended; that's a game called, uh, Ingrid's Return I think, and it's a text adventure that runs in 640x350 and displays static artwork at the top, which leverages the extended palette. Hooray!

The remaining images demonstrate how the switchable color palette on these monitors affects the signal. I'm running Chip's Challenge, a 320x200 EGA game. The third image is how it should look, using 4-bit RGBI color. The second image has it forced into 3-bit, Beeb-style color, which basically means that all the colors are full intensity. Finally, the fourth image is interpreting it as 6-bit, which basically just means all the colors are sad and muted and wrong. The telltale is that white areas become blue; this always seems to happen for some reason.

And... that's it. That's what this all adds up to. As I said, nobody needs an EGA monitor, there's no reason, you almost certainly have no tasks for which it's the best solution. All EGA software will run perfectly on a VGA card on a VGA monitor, and there are almost no programs using the enhanced mode and even fewer using the expanded palette in 200-line mode. The only reason I bothered putting in the effort to get these things is because I was curious about a question that nobody on the internet would answer for me. This whole thing is just wankery. Thanks For Reading



in reply to @cathoderaydude's post:

I'm so mad at myself for throwing away two of these in house moves in 2014

my current burden is a 21" eizo trinitron sold to me as working, that took an entire day plus back and hip injury to collect. when i got it home it had somehow every crt failure mode, just not all at the same time

i'm just about to wrap it up to take to my new place, because i know that even if i take over a year to test every component it will be faster than finding anything vga in public transport range

i hate this shit! i wish i was fuck you rich so i could just drop £500 on a minter on ebay

From the programmer's perspective, VGA is very very very much "EGA with some extra modes" (and EGA is largely "CGA with weird planar bankswitching logic to access more memory without taking up more address space"); pretty much all of the registers live in the exact same addresses in memory.

My best guess at what's happening here is that the EGA BIOS ROM is being ignored, and the onboard VGA BIOS is handling setting video modes (which was generally the one thing you'd let the BIOS handle rather than poking registers yourself, modulo fun hacks like Mode X). The VGA BIOS code would poke the appropriate registers to set things up for 640x480x16, and both the onboard VGA and external EGA card would see those values on the bus. Of course the EGA card would only understand, like, 80% of what was being written, but it'd likely ignore the other 20% and end up in some kind of usable video mode. After that it's just a matter of pushing pixels into a framebuffer, which, again, would be at the same address on both cards.

Weird stuff is likely to happen if you start using programs that read from video RAM or registers, especially if the VGA and EGA were to get out of sync about what exactly was there (perhaps due to different amounts of video RAM?). But reading from video memory for the purpose of actually looking at the value is so slow (and such a pain in the ass due to the weird planar layout) that programs will pretty much never bother, preferring to just keep the canonical copy of whatever they need in system memory and write to video only when necessary.

ahhh, okay, I'd had it in my head that their vram map wasn't compatible for some reason. This makes more sense, just a stack of happy accidents / IBM designed both cards and didn't bother making many changes where they didn't need to

Fun story: When I started digging into DOS programming, I read Michael Abrash's legendary "Graphics Programming Black Book". Throughout the book, it refers specifically to VGA, mentioning EGA in only a handful of places, comparatively. I wanted to write a game that would run under EGA, though, so I started digging around for info specifically about it.

Eventually I found a really good article about EGA fundamentals in an old programming magazine on archive.org... written by Michael Abrash. And to my astonishment, it was basically word-for-word identical to one of the chapters in the book, except that in the book, every reference to "EGA" had been replaced with "VGA".

in reply to @cathoderaydude's post:

Yeah, I run into this exact scenario a lot, where I do something the hard, long, or expensive way first, and then an easier solution presents itself a few minutes to a few days later. I feel like it's cosmic irony that drives moments like these, especially when it's something like finding two EGA monitors within days of each other!

Ah, you’ve encountered Microvitec

As well as supplying Acorn, they were basically the only game in town if you needed an Amiga monitor post-Commodore, to the point where Amiga Technologies OEM'd them. they command high prices and don't show up that often

i got one in 2018 from Literally NATO. as in, it sat on a radar base connected to some mystery equipment through adapters i've never seen before or since, with NATO asset tags stuck on top of a part number-serial number label from some military supplier claiming Made In USA, which was in turn stuck over Microvitec's own label that said made in the uk

it was knackered! it had a cool problem that took forever to discover

you know how Classic macs have a grey desktop even though they’re 1 bit video? it’s just a one pixel checkerboard dither, and it made the Microvitec freak out. the more of that dithering was on the screen, the blurrier it got. i ran a mac emulator on it and had no idea what the fuck was happening. it didn’t happen with any other imagery and i never figured out why. it might even be inherent to the design

i wonder if it was some wacky shit like the bit pattern was leaking onto a sync line or coupling to a capacitor or whatever. i'm used to that nonsense from LCDs at this point but in a CRT that's weird as hell

microvitec is such an anomaly. "mid-80s UK electronics" is not supposed to mean "can be used as a wheel chock for an Abrams"