
cathoderaydude
@cathoderaydude

i stumbled on a github project earlier, an open source reimplementation of Commander Keen called omnispeak, and i was poking around in the issues page (you can learn all kinds of interesting things by looking at issues rather than the actual project) and saw one that instantly confused the hell out of me.


the issue concerns a minor "fix jerky motion" setting. the maintainer addresses it, lots of implementation details, yada yada - but the thing that intrigued me was this:

Since those EGA games must have been tied to a 70Hz display mode back in the day, I am not surprised that the timing in this engine tries to replicate that (I guess original games ran at 35FPS, since thats an exact divisor for 70. At least Keen 1 did, so...)

But Wait

EGA never ran at 70Hz. That's a VGA thing - in textmode, and in its 320x200x8bpp mode that was used for the majority of popular games, it did indeed run at 70hz, and this is why Doom, for instance, ran at a fixed 35fps. id knew it couldn't reliably hit that speed, and if they just let it run uncapped, it could wildly swing anywhere from 15-70fps on a midrange machine and become totally unplayable. instead, they capped it at a much safer 50% figure. this is common knowledge. but why would this apply to an EGA game??

so I asked, and the maintainer (sulix) was happy to answer: VGA's implementation of EGA modes is egregiously incorrect. even if you select the explicitly-EGA-compatible mode 0Dh (as keen does), the card will still render at 70hz.
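(for reference, "selecting mode 0Dh" just means a BIOS set-video-mode call - something like this minimal sketch, assuming a Borland-style DOS compiler. on a real VGA card, this exact call is what comes back at 70hz instead of 60:)

```c
#include <dos.h>   /* int86(), union REGS - Borland/Microsoft style */

/* ask the video BIOS (int 10h, AH=00h) for mode 0Dh:
   the EGA-compatible 320x200 16-color planar mode Keen uses */
void set_mode_0dh(void)
{
    union REGS r;
    r.h.ah = 0x00;        /* function 00h: set video mode */
    r.h.al = 0x0D;        /* mode 0Dh: 320x200x16 EGA graphics */
    int86(0x10, &r, &r);
}
```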

this is astonishing to me. the EGA modes are supported by VGA for backwards compatibility, but IBM just went ahead and fundamentally changed how they're rendered. that's wild. and wouldn't this be a problem for timing? well: not in Keen.

sulix says that id developed Keen on VGA-based machines - which makes sense to me, assuming my impressions of the era are correct. For instance, I have always assumed that there was a chicken-and-egg problem that kept a lot of PCs far, far behind the curve. VGA became available in 1987, but it took a bit for affordable clone cards to come out - and then, why would you buy one? for all the blistering-fast, eye-searing software available for it?

I don't think anyone was writing VGA games in 1988, or 89, or 90, or 91, because probably 95% of the market had EGA cards at best, and nobody was upgrading because there weren't great VGA games, and so on. there were LOTS of games released WELL into the 90s that had not just EGA support, but fucking CGA support, that's how bad the long tail was.

I assume that's largely why id designed Keen for EGA - to make it playable on all those outdated machines. at the same time, there's no fucking way anyone there had a machine without VGA by that point. so yeah, it makes sense that they would have developed this game, and all their other pre-Wolf3d titles, on machines that ran EGA video modes 16% faster than intended.

probably for this reason, they made the game's internal timing independent of the video subsystem; it actually regulates itself off the PC's programmable interval timer, as god intended. that means that if you play it on a machine with real EGA, all the physics etc. will work correctly, but you'll get a bit of visual judder as the game tries to fit 35fps worth of updates into effectively 30fps of output.
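(for the curious, "regulating off the PIT" looks roughly like this. a sketch only - Borland-style DOS C assumed, and the 140hz tick rate is just an illustrative divisor, not necessarily the rate Keen actually programmed:)

```c
#include <dos.h>   /* outportb(), getvect(), setvect() - Borland-style */

#define PIT_CLOCK  1193182L   /* the PIT's fixed input clock, in Hz */
#define TICK_RATE  140        /* illustrative tick rate, not Keen's actual value */

static volatile unsigned long ticks = 0;
static void interrupt (*old_irq0)(void);

/* IRQ0 handler: count ticks, acknowledge the interrupt controller */
static void interrupt timer_isr(void)
{
    ticks++;
    outportb(0x20, 0x20);           /* EOI to the master PIC */
}

void timer_install(void)
{
    unsigned divisor = (unsigned)(PIT_CLOCK / TICK_RATE);

    old_irq0 = getvect(0x08);       /* save the BIOS timer handler
                                       (restore it before exiting!) */
    setvect(0x08, timer_isr);

    outportb(0x43, 0x36);           /* channel 0, lobyte/hibyte, mode 3 */
    outportb(0x40, divisor & 0xFF); /* low byte of divisor */
    outportb(0x40, divisor >> 8);   /* high byte of divisor */
}

/* the game loop then paces itself by watching `ticks`, completely
   ignoring whatever refresh rate the video card happens to run at */
```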

the thing is: EGA came out in 1984, and games that used its modes were produced well up into the 90s. somewhere in there, there must be games that do time themselves off of vblank, because from 84-87 there was no reason to think IBM was going to suddenly release a card that invalidated your assumptions about timing. so i'm very curious what this change broke, and whether IBM acknowledged it, and whether they offered any solutions.

also, this is like the fifth time in the last couple days that I've run into the fact that VGA doesn't actually support 320x200 or 640x200 resolutions on its analog output. the card can run those modes internally, but they get line-doubled on output. thus, if you plug a VGA card into any monitor with an OSD, it's always going to report an "incorrect" 720x400@70hz mode, since that's genuinely what's being output.

p.s.: what's particularly odd about this last bit is that the IBM VGA monitor supported 60hz for the 640x480 mode. had it not been a double-sync display, this would all make sense to me - IBM just wanted to sell a single 70hz display and be done with it. but that's not the case. did running at 60hz bug them because it was flickery, so they did their best to minimize it?

well, maybe. from one of the VGA technical manuals:

The video subsystem supports attachment of 31.5 kHz horizontal sweep frequency direct drive analog displays. These displays have a vertical sweep frequency capability of 50 to 70 cycles per second, providing extended color and sharpness and reduced flicker in most modes.

it honestly makes a lot of sense that IBM would just go "fuck any games that depended on this, we'd rather have as much software be flicker-free as possible."

p.p.s.: another possibility, which I'm having trouble nailing down, is that this was never a problem because nobody timed games that way. apparently EGA had no vertical blanking interrupt, so the only way to check for vblank was to poll for it. since the PC did have the PIT, which I think was not a common feature on home computers and consoles, maybe everything just used that the same way Keen did and the whole concept of timing stuff off of vblank just never existed on the PC.
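(polling, for the record, means sitting in a loop watching bit 3 of the input status register at port 3DAh. something like this sketch, again assuming Borland-style C:)

```c
#include <dos.h>   /* inportb() - Borland-style */

#define INPUT_STATUS_1  0x3DA   /* input status #1, color adapters */
#define VRETRACE        0x08    /* bit 3: vertical retrace active */

/* the classic polled vblank wait: if we're already mid-retrace,
   wait for it to end, then wait for the next one to start */
void wait_for_vblank(void)
{
    while (inportb(INPUT_STATUS_1) & VRETRACE)
        ;
    while (!(inportb(INPUT_STATUS_1) & VRETRACE))
        ;
}
```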



in reply to @cathoderaydude's post:

fwiw, I'm a bit hypersensitive to flickering lights, and I couldn't bear VGA on CRTs at less than 70hz. The move to LCDs was a boon for me. (in movie theaters, if the movie is boring enough, I start noticing the flicker).

maybe everything just used that the same way Keen did and the whole concept of timing stuff off of vblank just never existed on the PC.

I wonder if that's why so many EGA-era games - well into the era where computers could be faster than 4.77 MHz - have so many problems with running far too fast on even a 386. If nobody'd ever thought of timing via vblank, and nobody had written their own clock-based timer instead...

I believe that the EGA resolutions run at 70hz on VGA chipsets because VGA's native 60hz video modes actually have square pixels. That's why VGA's "mode X" is 320x240 rather than EGA's 320x200, and high-resolution VGA is 640x480 rather than EGA's 640x350. IIRC, under the hood, they run through the same video signal generation circuit, running at the same speed - it's just that, for the EGA graphics modes, the signal is done 80 scanlines sooner. I guess they figured it was simpler to build a monitor that could deal with that than to conditionally slow down the signal generation on the board.
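A quick back-of-the-envelope check on that theory: with one shared horizontal rate (~31.469 kHz), the refresh rate is just the line rate divided by the total line count per frame. The 449/525 totals below are the standard VGA figures as I remember them, so take this as a sanity check rather than gospel:

```c
#include <stdio.h>

int main(void)
{
    const double hsync = 31469.0;   /* shared horizontal rate, Hz */

    /* 400 active lines (incl. line-doubled 200-line modes), 449 total */
    printf("400-line modes: %.1f Hz\n", hsync / 449);   /* ~70.1 */

    /* 480 active lines, 525 total */
    printf("480-line modes: %.1f Hz\n", hsync / 525);   /* ~59.9 */

    return 0;
}
```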

I think IBM cared much less about backwards compatibility than it did about control over the market, and VGA came out at a time when the PC standard had really started slipping out of IBM's control. Compaq came out with the first desktop 386 in 1986, and IBM's response in 1987 was to try to kill the ISA bus and make the PS/2 happen. VGA was part of that push! They only released an ISA VGA card as an upgrade option for their older machines, and what they really wanted to sell was the high-priced 8514/A add-on, which supported resolutions up to 1024x768 and provided hardware-accelerated drawing primitives. I can't imagine a tweak to the EGA refresh rate potentially causing some games to run a little too fast would have been on their radar as an important issue.

I think you are correct that it was pretty rare for DOS games to rely on vblank for timing purposes. After a bit of research, it looks like the vblank interrupt was available on both EGA and VGA boards... manufactured by IBM.

What appears to have happened is that VGA clone boards discovered that Windows didn't use it and things generally seemed to work perfectly fine if you just... didn't wire it up. Apparently a significant quantity of boards shipped with it disabled by default, with the option to enable it with a hardware jumper if you wanted. Even then, some boards just wired it up as a 60hz timer, with no relationship to the video timing at all. This was widespread enough that DOSBox apparently does not enable support for the vblank interrupt unless you specifically tell it to emulate an EGA machine.

(Source: this random usenet post from 1996.)
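If anyone wants to poke at this in DOSBox, the relevant switch is the machine setting in dosbox.conf. As I understand it, the default SVGA machine type leaves the vblank interrupt unimplemented - matching the typical clone boards - while asking for a real EGA wires it up:

```
[dosbox]
# default is an S3 SVGA; switch to a real EGA to get the
# IBM-style vblank interrupt behavior
machine=ega
```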