cathoderaydude
@cathoderaydude

i bought this thinking it could output OG-accurate and/or scaled analog video and For Values Of "Can't" That I Don't Wish To Debate, haha, no, it can't. it's Disappointment Week here at the ol gravis house

i will expand on this later



meandering rambling that's probably mostly wrong


i am in no way mad at anyone here, to clarify, because this is a mix of misunderstandings on my part and "oh i guess that's how that would have to work" and open source project with very few maintainers syndrome. i'm just writing this so you understand why i'm disappointed.

what i had in my head when i bought this thing was that it would be Basically On Par With Emulators, Except, and the Except was a big question mark space that vaguely said things like "hardware accuracy in ways emulation can't achieve." that's a really broad target to hit though.

primarily video hardware is my Concern Area, and there were basically two notions in my head:

  • that MiSTer cores all had functional replicas of original hardware video circuitry. if you loaded an NES core it would have a PPU that output raw NTSC video the exact same way the NES did - perhaps not using identical circuitry, but still in the form of a RAMDAC that converts pixel values directly to NTSC color phases with no intermediate stages, so you'd get a more or less perfect replica of the signal from a real NES, quirks and all, happening at a "racing the beam" rate.

  • if a core doesn't have good analog output circuitry, or you don't have a display that would work well with the simulated machine, you would be able to coerce it into a desired output format with some tweaking

these are... true enough, in many cases, but not as true as i'd like, and in some cases not true at all.

for instance, if you load a ZX Spectrum core, it'll spit PAL out the VGA port, and it'll put a picture on an analog TV screen, but it's just being coerced from the same digital version of the image that's being sent to the HDMI output, going through a generic composite video generator copied from some other core. now, this shouldn't... matter, in any way, right? i mean the spectrum generates an analog image from a framebuffer. that's the same thing. if the core outputs a composite signal that looks right on a TV then who cares if it was generated with a netlist identical to an original speccy ULA?

well, okay, i can't really argue with that except that... i thought that's what i was getting, a thing that works that way. like, it's already pretty hard to explain what function the mister serves, with emulation being so good these days, and if it's just a generic framebuffer with a mindless svid/composite/rgb converter then this all feels like one big leap closer to "a raspberry pi with a shitty 3.5mm TRRS hanging out of it." i thought the whole point was "we replicated all the hardware, warts and all" but that's just overwhelmingly not the case from what i can tell - most stuff is reimplemented in ways that don't bear any resemblance to the original hardware internally, and in some cases even externally.

case in point, and an actual complaint of mine, a thing that literally does not work correctly, defeating the purpose completely, is the Tandy CoCo core. the CoCo used an Apple ][ style "artifact video" scheme, leveraging weird timing bullshit in NTSC video. with an original CoCo, if you plug it into a B&W TV, you see ugly black and white stripes on the screen, because that's how artifact color worked: you output nasty B&W bit patterns, and the edges land on carefully-tuned spots that cause a color TV to interpret them as color signals. It's very limited, you only get two ugly hues (blue and orange) but it's enough for a lot of neat games to look passable.

on the mister coco core, if you plug in a color composite TV you'll get... black and white stripes. because the core generates those internally, and feeds the raw bitmap to the generic composite converter, which does the "right" thing: it turns them into a perfectly clean and crisp picture of black and white stripes. this module was designed to output "correct" NTSC, so it ruins the CoCo, which depends on wrong NTSC.
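to make the timing trick concrete, here's a toy python sketch of the idea - the hue assignments and function names are mine, this is not how the coco (or any core) actually decodes anything, just the shape of the trick: hi-res pixels run at roughly twice the 3.58mhz colorburst, so a two-pixel group spans one chroma cycle, and which half of the cycle is bright sets the phase, which a color set reads as hue.

```python
# Toy model of NTSC artifact color (illustrative, not a real NTSC decoder).
# Hi-res pixels are clocked at ~2x the 3.58 MHz colorburst, so each pair of
# pixels spans one chroma cycle. A color TV treats any energy at 3.58 MHz as
# chroma; the phase of that energy (which half of the cycle is lit) picks the
# hue. A B&W set just shows the raw stripes.

def artifact_color(pixel_pair):
    """Decode one two-pixel group the way a color set (roughly) would."""
    if pixel_pair == (1, 1):
        return "white"   # both halves lit: pure luma, no 3.58 MHz component
    if pixel_pair == (0, 0):
        return "black"   # no energy at all
    if pixel_pair == (1, 0):
        return "blue"    # chroma in phase with the burst (toy choice)
    return "orange"      # (0, 1): chroma 180 degrees out of phase

def decode_scanline(bits):
    """Group a bit pattern into pixel pairs and 'decode' each one."""
    return [artifact_color(tuple(bits[i:i + 2])) for i in range(0, len(bits), 2)]

# the same stripes a B&W set shows come out as two hues on a color set:
print(decode_scanline([1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0]))
# → ['blue', 'blue', 'orange', 'orange', 'white', 'black']
```

the point being: a "correct" composite encoder that treats those bits as honest luma pixels produces exactly the clean black-and-white stripes described above, because the color only ever existed as a phase accident.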

you want to see the colors? no problem - go into the OSD and turn on NTSC emulation. the core will then interpret the bit patterns and convert them into colored pixels, which the scaler then converts into the crispest, cleanest picture of those colors you've ever seen. not the smeared confusion of the original CoCo, but something that looks damn near like RGB. it looks like an emulator, in other words.

this is disappointing because it's one of the things i specifically wanted from the MiSTer, which it's uniquely suited to: you cannot trick a (fucking) raspberry pi into outputting artifact video, or if you can, it's never going to be accurate. it's always going to be too clean, or it's going to be smushing pixels together in ways that the real hardware never did, but it's never going to be right. i want something that's EXACTLY as bad as the original hardware, but not worse in any of the 2020s ways that emulators (and the fucking raspberry pi) usually are. but that would have required someone to implement the CoCo's video hardware in verilog, and here's the dark secret i've learned: almost nobody is working on the mister.

oh sure, there's the amazing work with all the arcade cores, and a couple headliner consoles, but step outside of the popular platforms and it's a ghost town. most of the cores for home computers and less-common consoles, as far as I can tell, have been sitting in alpha "hey it technically works" status for 6-8 months, which in many cases is nearly as long as they've existed. a lot are ports from other, standalone FPGA projects which have just been adapted to fit the MiSTer and that's it.

making things worse, in a lot of cases there are multiple cores. i have two different IBM PC cores, two different MSXes, two ZX Spectrums, and even two different CoCos. in most cases the github pages for these are incredibly sparse. the coco3 core simply says it's a port of some guy's tandy-on-a-chip project, and tells you where to put your boot roms. that's it. i had to figure out for myself that it doesn't do the analog video correctly because there's no mention of it, and that's probably because the original project likely only output HDMI; the composite support was part of the greater mister project and they just plugged it in.

i'm pretty sure the coco core is very new; maybe someday someone will add proper NTSC output to it. i sure can't do it though, and since 95% of the issues i've seen open on any mister core are being addressed by the one guy who runs the whole project, i don't think it'll happen soon. i feel like probably these platforms are not priorities for him.

this is just one of many, many, many complaints i have about many cores that are clearly "someone just hasn't gotten to that yet" issues. i can't cite others because there's so many i've lost track, but i'll say this much: if you're used to emulators, almost all of which have 10-30 years of effort put into them at this point, then buckle up because you're gonna miss a lot. for instance, almost nothing supports save states or "fast forward." NES, SNES, Genesis, maybe PSX have save states; i think that's about it, nothing else does. some projects have even said there are solid technical reasons that they will never do save states.

likewise for "fast forward", where you hold a button to make game go faster to skip cutscenes etc. - very easy to do in emulation, but on an FPGA you need to make all the hardware actually go faster, and that's MUCH harder to do on the fly. We're talking, "switching the frequencies on 6 different PLLs feeding different subprocessors, whoops, one of them was the sound chip and now it's desynced, oh no, oh no." so it's just... never going to happen. it's not a matter of time and effort.

so, okay, video issues, maturity issues, feature parity with emulation... and then there's the scaler thing. and honestly i don't know what i expected here but it's not like I'm not blaming myself? i'm just explaining my feelings and idk maybe you have similar misconceptions.

i guess what I had in my head was that the video portion of each core would be abstracted in some way so you'd be able to adapt to whatever display device you have cleanly, whatever that means. this is obviously the retrogaming holy grail problem - i own no fewer than a dozen devices that purport to scale video and it's an eternal, miserable bitch of a problem that just doesn't get better because, in many ways, it can't. this is just a thing you shouldn't do, especially with signals intended for analog displays.

now, in one case i had extremely poorly set expectations - i had hoped to use a PC88 on a PVM. as i wrote about a month ago, this is not possible; the PC88/98 output 400-line video that absolutely cannot be displayed on anything but a proprietary NEC display, there is no meaningful way to convert it to anything else. "oops." like, this is entirely my fault and cannot be fixed, but... it IS still a major thing I had hoped for which turns out to be infeasible. to its credit however, it WILL display cleanly on a VGA monitor, and since the PC88 was a native RGB device it basically looks perfect, so ultimately this DID end up doing what i wanted.

see, i love emulation. i think it's 99% of the way there for almost every platform and it's amazing that it's come so far. i am VERY happy with emulation and since i don't play fight games i don't care about latency issues. but the one thing it can't do is output analog video - i have PC88 emulators, but I'm stuck looking at them in a window, and given the Energy of that particular platform, it really deserves to be on a CRT.

i do not want to keep a PC88 and special monitor around just to achieve this, so, i was hoping the Mister would let me have a more hardware-esque PC88 experience without a custom display, and... it did! I use a VGA monitor and it looks outstanding, probably about like an original one, though I'd definitely like to have the chance to test that assumption. so honestly that's kind of a justification for the mister on its own. i mean, i paid under $500 for this - you can't get a PC88 let alone a display for half that. so, i guess... i won?

but then there's stuff like the C64 and ZX Spectrum. the big problem with using those on CRTs is the phosphor situation - as an american, i cannot readily obtain a CRT that does not have horrific, eye-searing flicker at 50hz, and running these platforms at NTSC speeds is not an option. C64 games will all run too fast, and Spectrums just didn't come in NTSC. emulators and mister cores don't even offer the option; it's 50hz or go home.

well, i had heard the mister had a very flexible scaler for its VGA output, and I had seen VGA output looking very crisp in photos, so I thought, great, I'll have the scaler step it up to 100hz. i have a VGA monitor that can do 160hz at 640x480 so that'll work great.

okay, well, it turns out that it's much more complicated than that. the mister has two options for its VGA output: the scandoubler, and the scaler. anyone who works with retrogaming etc is familiar with this distinction - the scandoubler does exactly what it says on the tin, it scans every line twice, doubling the resolution and timing while achieving a perfect 2x integer scale. the scaler is a more arbitrary device which operates in the digital domain and can achieve much more varied output, but rarely has... exactly the results you'd hope for.

the scaler is nearest-neighbor, which is, you know, what you'd want. you don't want bilinear filtering here; ugh, shudder. the problem is that nearest-neighbor can only cleanly scale at perfect integer values so if you're targeting an in-between, "1.5x" scale, you get weird lumpy pixels as a result. this cannot be fixed.
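here's a quick python sketch of why non-integer nearest-neighbor gets lumpy (toy code, nothing to do with the mister's actual scaler): at 1.5x, source pixels have nowhere to go but alternating runs of 1 and 2 output pixels.

```python
# Nearest-neighbor scaling of one row of "pixels". At an integer factor every
# source pixel maps to the same number of output pixels; at 1.5x the mapping
# has to alternate between 1-wide and 2-wide, which is the lumpy-pixel look.

def nearest_neighbor_row(row, factor):
    out_len = int(len(row) * factor)
    return [row[int(i / factor)] for i in range(out_len)]

src = list("ABCD")
print(nearest_neighbor_row(src, 2.0))  # ['A','A','B','B','C','C','D','D'] - clean 2x
print(nearest_neighbor_row(src, 1.5))  # ['A','A','B','C','C','D'] - A is twice as wide as B
```

and that unevenness is inherent to the algorithm, which is why "this cannot be fixed" - you can only hide it by filtering, which trades lumpy for blurry.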

so, the problem is that these are separate modules. the scaler is sure as shit capable of outputting any res you want - it'll do 320x240@240hz if you like, and there's a clever way of configuring it like this to get 240p video (like from an NES) with nice thick scanlines that are actually scanlines, they are places where the beam didn't scan at all. but it'll ONLY do its scaling with the nearest-neighbor algorithm, so the results look like dog shit. and i can put the Spectrum and C64 cores through this to force them to 60hz or even 100hz to deal with the flicker, but they also come out looking like dog shit.

the scandoubler on the other hand delivers perfectly crisp, accurate scaling... at 50hz. so my VGA monitor flickers. and the thing is, i'm sure this isn't a true scandoubler; it's outputting 640x480, which is not a multiple of the Spectrum resolution, so i'm SURE it's doing some kind of internal scaling, and thus it could probably be adjusted to scan quadruple, producing 100hz, but... if it's possible, nobody has done it, so this still doesn't solve my problem: i cannot get a zx spectrum on a glass tube that doesn't look like a strobe light.

it MIGHT be possible to tune the scaler with special "modelines" to get the pixel clock to match better but those are incredibly complex and i have no idea what i'm doing with them, and it's also just a bummer because... this is exactly what you spend all your time doing with an ordinary scaler. i might as well just bring my Spectrum home, we're right back where we started.
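for what it's worth, the one part of modelines that isn't black magic is the arithmetic: the pixel clock is just total columns times total lines times refresh rate. toy python below - the blanking numbers are made up for illustration, not a validated timing that any particular monitor will sync to.

```python
# A modeline's pixel clock is total pixels per frame times refresh rate:
#   pclk = htotal * vtotal * vrefresh
# htotal/vtotal include the blanking intervals, not just the visible area.
# The numbers here are invented for illustration; a real monitor needs
# values inside its horizontal/vertical sync limits.

def pixel_clock_mhz(htotal, vtotal, refresh_hz):
    return htotal * vtotal * refresh_hz / 1e6

# hypothetical 640x480 visible @ 100 Hz, padded to 840 total columns
# and 500 total lines of timing:
print(round(pixel_clock_mhz(840, 500, 100), 1))  # 42.0 (MHz)
```

which is also why doubling the refresh roughly doubles the pixel clock - and why the monitor's bandwidth, not the FPGA, is usually the hard limit.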

i mean, i'm being unfair. my real spectrum can't fastload tapes, ffs, and it, too, outputs 50hz, so - if all this is is "a pretty accurate zx spectrum that will output the same analog image but can quickload tape images from SD", that beats the pants off of any alternative I have. and i know these problems are hard to fix, just... idk. i guess i thought since the mister had intimate access to the framebuffer, and the innards of the cores, that it could synthesize some better solution. instead i'm scraping google results for "mister fpga 'spectrum' 'modeline'" and getting nothing.

so... yeah, idk. it does stuff, it just doesn't do what i hoped. the console implementations are much better tbh but i don't care about that - i own almost every console it can simulate, and I've already seen 98% of the games that ever came out for them. home computers were really the focus when I bought this thing, and unfortunately those cores just aren't where I wanted them to be. oh well.



in reply to @cathoderaydude's post:

it's not really been that bad haha, i'm just like, kinda bored rn because i can't work so i'm Bothering to complain about things. and in some cases i'm doing things that i would not normally do, so i would not normally have the opportunity to be disappointed. but it's minor crap

so, I have the FPGA unit for one of these, but none of the extra hardware, and frankly I have no idea how I'd use it for MiSTer things, because by a matter of accident I happen to be using it for the FPGA development side of it a lot more than as a part in a vintage gaming rig lol

in reply to @cathoderaydude's post:

Have you tried experimenting with direct video at all? Trying to find an appropriate adapter/encoder for the HDMI out is *laughs like a dude who just got kicked out of a bank* but if you had the hardware to get it working might actually match up with the kind of thing you're looking for.

As for me I appreciate that it's as close to cycle-accurate as you're going to get by virtue of the way the platform works. I accept that many shortcuts the average emulator makes are entirely justified but it's a much better development target than that. As a guy who is very persnickety about Genesis audio, I was quite pleased with how the MD core produces sound.

i have not and maybe that's because i didn't understand it? like perhaps direct video is where all this functionality lives and it IS implemented and i just don't know it. otoh it also smelled like a feature that only the most popular cores had a chance in hell of having

I'm a huge MiSTer fan, but all your arguments are extremely correct and valid. I treat it as an extremely accurate 8/16-bit console box that obviates my need to have an NES, SNES, PC Engine, etc. all hooked up a TV (my space is limited). Once you get outside of those and a handful of other cores that have seen a lot of love (Amiga comes to mind), things get rickety very fast.

I suspect (i.e. talking out of my ass) that most of what you want is possible with sufficient effort on the part of the core maintainers, but it really takes someone with very specific EE skills and very specific special interests to get cores to that level.

Yeah, I remember messing around with the bally astrocade core and that thing isn't really quite built with the modern expectation of one player one controller as some games have you doubling up on them (go figure, they were one of those early ones that didn't have buttons, plural). I'm quite impressed at what it is but the more, uh, archival system implementation is definitely spotty.

So you overestimated the savestate support. The newest mainstream console with them is the NES. No genesis or snes and definitely not psx.

The one possible consolation is that it is possible for a core to Do Analog Video Accurately in the future. But something something we'd all be running the gnu hurd

Yeah, savestates in hardware are actually extremely difficult from my understanding. It requires you to have latches on every single bit of state in the system, because you need to be able to reliably read from and write to all of them at once. It's a lot of work to add that to your HDL, and it can take up a lot of space on the FPGA. It's one of those things that's very easy to do in software emulation, but very hard in hardware.
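(for contrast, the software side really is that trivial - a toy python sketch with invented register names, not any real emulator's code: the entire savestate is one deep copy, which is exactly the "read every bit of state at once" operation that costs so much in HDL.)

```python
# Toy "chip" showing why savestates are easy in software: all state lives in
# one structure, so save/load is a single deep copy. In hardware, every one
# of these fields is a flip-flop that would need its own readback path.
import copy

class ToyChip:
    def __init__(self):
        self.state = {"pc": 0, "a": 0, "ram": [0] * 16}

    def step(self):
        """One made-up 'cycle': store the accumulator, then bump everything."""
        self.state["ram"][self.state["pc"] % 16] = self.state["a"]
        self.state["a"] += 1
        self.state["pc"] += 1

    def save(self):
        return copy.deepcopy(self.state)   # the whole savestate, one line

    def load(self, snap):
        self.state = copy.deepcopy(snap)

chip = ToyChip()
chip.step(); chip.step()
snap = chip.save()   # snapshot mid-run
chip.step()          # keep going...
chip.load(snap)      # ...and rewind perfectly
```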

sorry to burst your bubble but afaict (N=2) european crt tvs do not, in fact, have longer-persistence phosphors (or if they do, they're certainly not enough to mitigate the flickering; maybe they're better than ntsc sets but i couldn't tell you). i guess ppl just put up with it back then until 100 hz sets became available

man this is staggering if true. i... i literally cannot comprehend whole nations putting up with this. it's violently painful on all the screens I've ever tried it on. i assume n=2 means you have a couple specimens? are they pro sets or consumer?

consumer, one from a local brand from what feels like the late 80s, and a late 90s trinitron (kv-16wt1), both highly effective headache machines at 50 hz ;v;

i use the trinitron as a third display for my pc and thankfully running it at 60 hz (which most pal sets do support, thankfully) while wearing headphones makes it actually usable. sucks if the stuff you plug into it only outputs 50 hz tho

here's the thing. you just get used to it.

like, i went though several years of not owning a CRT of any kind. then i saw one and I was like "ow fuck, my brain"

then i started collecting them and my eyes adjusted in 2 days and i couldn't perceive it anymore

"but then there's stuff like the C64 and ZX Spectrum. the big problem with using those on CRTs is the phosphor situation - as an american, i cannot readily obtain a CRT that does not have horrific, eye-searing flicker at 50hz"

oh don't worry, we didn't have any longer persistence phosphor than you

we just embraced the flicker

this is so hard to believe. like obviously i am not calling you a liar but... but... really?? really??? i don't get it??? how did anyone tolerate it?? it's below the flicker fusion threshold! they just DID this?? and everyone just put up with it??