I stumbled on a github project earlier - an open-source reimplementation of Commander Keen called omnispeak - and I was poking around in its issues page (you can learn all kinds of interesting things by looking at the issues rather than the actual project) and saw one that instantly confused the hell out of me.
the issue concerns a minor "fix jerky motion" setting. the maintainer addresses it, lots of implementation details, yada yada - but the thing that intrigued me was this:
Since those EGA games must have been tied to a 70Hz display mode back in the day, I am not surprised that the timing in this engine tries to replicate that (I guess original games ran at 35FPS, since thats an exact divisor for 70. At least Keen 1 did, so...)
But Wait
EGA never ran at 70Hz; its modes refreshed at 60Hz. 70Hz is a VGA thing - in text mode, and in the 320x200x8bpp mode that was used for the majority of popular games, VGA did indeed run at 70Hz, and this is why Doom, for instance, ran at a fixed 35fps. id knew it couldn't reliably hit 70fps, and if they just let it run uncapped, it could wildly swing anywhere from 15-70fps on a midrange machine and become totally unplayable. instead, they capped it at a much safer 50% figure. this is common knowledge. but why would this apply to an EGA game??
so I asked, and the maintainer (sulix) was happy to answer: VGA's implementation of the EGA modes is egregiously incorrect. even if you select the explicitly-EGA-compatible mode 0Dh (as Keen does), the card will still render at 70Hz.
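(for context, here's roughly what "selecting mode 0Dh" looks like from a DOS program. this is just a sketch assuming a 16-bit DOS compiler like Borland/Turbo C with int86() from <dos.h> - the generic BIOS route, not necessarily how Keen itself sets up the screen.)

```c
/* sketch: ask the BIOS for mode 0Dh (320x200, 16-color planar EGA graphics).
   assumes a 16-bit DOS compiler (Borland/Turbo C style) providing int86(). */
#include <dos.h>

void set_mode_0dh(void)
{
    union REGS regs;
    regs.h.ah = 0x00;          /* INT 10h function 00h: set video mode */
    regs.h.al = 0x0D;          /* mode 0Dh: 320x200, 16 colors */
    int86(0x10, &regs, &regs);
    /* a real EGA scans this out as 200 lines at 60Hz; a VGA double-scans
       it to 400 lines and refreshes at 70Hz instead */
}
```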
this is astonishing to me. the EGA modes are supported by VGA for backwards compatibility, but IBM just went ahead and fundamentally changed how they're rendered. that's wild. and wouldn't this be a problem for timing? well: not in Keen.
sulix says that id developed Keen on VGA-based machines - which makes sense to me, assuming my impressions of the era are correct. I have always assumed there was a chicken-and-egg problem that kept a lot of PCs far, far behind the curve: VGA became available in 1987, but it took a while for affordable clone cards to come out - and even then, why would you buy one? for all the blistering-fast, eye-searing software available for it?
I don't think anyone was writing VGA-only games in 1988, or '89, or '90, or '91, because probably 95% of the market had EGA cards at best, and nobody was upgrading because there weren't great VGA games, and so on. there were LOTS of games released WELL into the 90s that had not just EGA support but fucking CGA support; that's how bad the long tail was.
I assume that's largely why id designed Keen for EGA - to make it playable on all those outdated machines. at the same time, there's no fucking way anyone there had a machine without VGA by that point. so yeah, it makes sense that they would have developed this game, and all their other pre-Wolf3D titles, on machines that ran EGA video modes about 17% faster than intended.
probably for this reason, they made the game's internal timing independent of the video subsystem; it actually regulates itself off the PC's programmable interval timer (PIT), as god intended. that means that if you play it on a machine with real EGA, all the physics etc. will work correctly, but you'll get a bit of visual judder as the game tries to fit 35fps worth of updates into a 60Hz refresh - effectively 30fps of output.
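(the PIT bit, roughly: channel 0 of the 8253/8254 timer is fed a 1,193,182Hz clock, and you pick your tick rate by writing a divisor - the chip then fires IRQ0 every `divisor` input clocks. a sketch, again assuming a Borland-style DOS compiler with outportb() in <dos.h>; the 70Hz below is just an example rate, not a claim about what Keen actually programs:)

```c
/* sketch: reprogram PIT channel 0 to fire IRQ0 at (roughly) a chosen rate.
   assumes a Borland-style DOS compiler providing outportb() in <dos.h>. */
#include <dos.h>

#define PIT_INPUT_HZ 1193182UL

void pit_set_rate(unsigned long hz)
{
    unsigned int divisor = (unsigned int)(PIT_INPUT_HZ / hz);

    outportb(0x43, 0x36);                  /* channel 0, lobyte/hibyte, mode 3 */
    outportb(0x40, divisor & 0xFF);        /* divisor, low byte */
    outportb(0x40, (divisor >> 8) & 0xFF); /* divisor, high byte */
}

/* e.g. pit_set_rate(70); a game would then hook INT 8 and count ticks to
   pace its simulation, completely independent of the video refresh. */
```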
the thing is: EGA came out in 1984, and games that used its modes were produced well up into the 90s. somewhere in there, there must be games that do time themselves off of vblank, because from '84 to '87 there was no reason to think IBM was going to suddenly release a card that invalidated your assumptions about timing. so I'm very curious what this change broke, and whether IBM acknowledged it, and whether they offered any solutions.
also, this is like the fifth time in the last couple of days that I've run into the fact that VGA doesn't actually support 320x200 or 640x200 resolutions on its analog output. the card can run those modes internally, but they get line-doubled on output. thus, if you plug a VGA card into any monitor with an OSD, it's always going to report an "incorrect" 720x400@70Hz mode, since that's genuinely what's being output.
p.s.: what's particularly odd about this last bit is that the IBM VGA monitor supported 60Hz for the 640x480 mode. had it not been a double-sync display, this would all make sense to me - IBM just wanted to sell a single 70Hz display and be done with it. but that's not the case. did running at 60Hz bug them because it was flickery, so they did their best to minimize it?
well, maybe. from one of the VGA technical manuals:
The video subsystem supports attachment of 31.5 kHz horizontal sweep frequency direct drive analog displays. These displays have a vertical sweep frequency capability of 50 to 70 cycles per second, providing extended color and sharpness and reduced flicker in most modes.
it honestly makes a lot of sense that IBM would just go "fuck any games that depended on this, we'd rather have as much software be flicker-free as possible."
p.p.s.: another possibility, which I'm having trouble nailing down, is that this was never a problem because nobody timed games that way. the EGA technically had a vertical retrace interrupt, but apparently it was hardly ever used (and later cards often didn't implement it reliably), so in practice the only way to check for vblank was to poll for it. since the PC did have the PIT, which I think was not a common feature on home computers and consoles, maybe everything just used that the same way Keen did, and the whole concept of timing stuff off of vblank just never existed on the PC.
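(for reference, "poll for it" means spinning on the Input Status Register: bit 3 of port 3DAh reads high during vertical retrace on EGA/VGA in color modes. a sketch, same Borland-ish assumptions as above:)

```c
/* sketch: busy-wait for the start of vertical retrace on EGA/VGA.
   assumes a Borland-style DOS compiler providing inportb() in <dos.h>. */
#include <dos.h>

#define INPUT_STATUS_1 0x3DA   /* Input Status Register 1 (color modes) */
#define VRETRACE       0x08    /* bit 3: vertical retrace in progress */

void wait_for_vblank(void)
{
    /* if we're already mid-retrace, let it finish first... */
    while (inportb(INPUT_STATUS_1) & VRETRACE)
        ;
    /* ...then wait for the next one to begin */
    while (!(inportb(INPUT_STATUS_1) & VRETRACE))
        ;
}
```

a game that paced itself by calling something like that once per frame would simply inherit whatever refresh rate the card really runs - 60Hz on a true EGA, 70Hz on VGA's reimplementation - which is exactly the kind of thing that would quietly break.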

