Osmose

I make websites and chiptunes!

  • he/him

AKAs:
Lapsed Neurotypical
JavaScript's Strongest Warrior
Fake Podcast Host
Thotleader
Vertically Integrated Boyfriend
Your Fave's Mutual
Certified 5x Severals by the RIAA
Inconsistently Medicated
The Source That Views Back
Carnally Known
The Alternative


Homepage
osmose.ceo/

I saw a short post about fundamental HCI principles being ignored by modern software. I've seen similar sentiments before.

It confuses me. It's not at all how I remember computers. At least computers as far back as the mid-90s.

Every Windows machine I had, until Windows Defender became a full antivirus, broke due to (presumably, you never really could tell for sure) a virus and had to be reinstalled at least once in its lifetime. Old desktop software often broke and couldn't be run anymore because it wasn't being updated. Java screamed at me for updates constantly. We used to wiggle the mouse to try and get websites to load faster. Getting sound to work consistently in games was often a dice roll. PDFs used to contain viruses! And we solved that one with fuckin JavaScript!!

I often feel like arguments about the decline of modern software usability take a very narrow view of what usability is. As if raised borders on a button are more important than not crashing your entire computer? The computers I remember had tons of issues that more or less don't happen anymore. We've picked up new issues along the way, sure, but like my computers haven't hard-crashed in a very long time, and websites from 15 years ago are much more likely to work in modern browsers than applications from 15 years ago are on a modern OS. I genuinely think the totality of user experience has improved significantly.

I do agree that older systems were less complex and more learnable than the systems of today, but I don't think many people actually want to learn these systems. The dream of learning the patterns of your operating system and then being proficient in most software that followed those patterns was never really a thing once GUIs came around; even back then, UIs were complex enough to need learning even when they used OS-default buttons. And people would rather learn their favorite websites or programs and get on with their day, because their goal is not to be good at computers. Their goal is to install Minecraft mods, make good Tumblr memes, record a video in Garry's Mod of Shadow the Hedgehog defeating the skibidi toilets, etc.

I'm curious if other people have a similar view of what computing in the past was like, or if their own experiences bear out this image of the usable past.



in reply to @Osmose's post:

Every Windows machine I had, until Windows Defender became a full antivirus, broke due to (presumably, you never really could tell for sure) a virus and had to be reinstalled at least once in its lifetime.

I first thought this would reference a godawful third-party antivirus breaking the install, which seemed just as common haha.

It feels like a lot of people who learned how to computer out of necessity and enjoyed it assume that would be universal. But most people arguably would take one look at AUTOEXEC.BAT and CONFIG.SYS and decide to spend their time on almost anything else.

I think it peaked a few (maybe 10? plague time has ruined my sense of anything too specific) years ago, is really what it is. We got most of the security problems like that out of the way at the tail end of the Windows 7 era, and things have been pretty messy since. Lots of steps forward and back simultaneously, and lately it's almost entirely been steps back. There was the old commentary that Microsoft started driving OS updates entirely by telemetry metrics, but anyone who knew how to use any of the stuff they started removing was ALSO the sort of person who'd disable telemetry first.

Anyway, more to what you asked...

At least on Windows, crashes basically vanished as of Vista. The driver model rewrite that came with it meant that any machine that should have been running it (had enough RAM, didn't have a Creative soundcard) was much better off doing so. I had an old desktop with failing hardware I couldn't afford to replace that was completely saved by that, since it just ended up crashing a driver and salvaging itself after about 20 seconds. Compatibility also improved dramatically (on both the Windows and Linux sides), and iirc that's about when Wine finally stopped being a complete piece of shit for most things (not to say it doesn't have rough edges). Sound started being consistent the moment Creative's monopoly on working sound on Windows broke / PulseAudio came around (god I hate to give it credit, but...), etc.

worth putting it out there that Windows was always the worst offender for instability and insecurity (especially Win95+), and there were still several viable alternatives (both for home use, and stuff you'd see at work or school) in the 90s and far more in the 80s.

I used OS/2 at home from '94 to the early aughts, and in that period, going from sometimes using Win3 in other contexts to sometimes using Win95/98/ME was like going from taking occasional involuntary rides in a 300k-mile no-features Volvo to rides in a shiny new Cybertruck. they sure were proud of hiring an aesthetic designer[citation needed], but it was an absolute cliff for my day-to-day expectations of things working consistently/at all. (in this metaphor OS/2 was a Subaru, albeit mainly for muh lesbian memes) (Mac was like a BMW or Audi or something) (BeOS was probably an Infiniti) (help I'm stuck in the bit, please someone assign cars to Amiga and NeXTSTEP so I can stop)