In this "fun"-filled article, we get to see a retrocomputing enthusiast:
- Try to find a UNIX workstation that doesn't cost a huge sum of money, and get lucky.
- Then try to find HP-UX install media, which HP doesn't have, so they have to ask on the Fediverse to find people with the correct install media.
- Then try to run period-correct software that was written for an earlier version of HP-UX (which runs thanks to backwards compatibility), because they can't find a version online tailored to later versions of HP-UX.
They failed to run a lot of software they specifically got for their shiny new HP-UX. This is because a lot of UNIX workstation software is/was proprietary and required licenses to use. Even the demo licenses expired long ago. And the still-existing vendors aren't interested in helping out enthusiasts, and/or don't even have the tools to generate licenses for their old versions anymore. Don't take my word for it; the author of the linked article tried to get licenses, and all the conversations fizzled out.
This article is a harsh lesson in how FlexLM and DRM are cancers, and how companies treat their own old software like trash to be swept away. Software is being destroyed at an alarming rate due to negligence. Even if the software isn't commercially viable anymore and the hardware platforms are niche, I argue that the hard work and energy put into creating the software is only truly lost when people can't¹ run those old versions anymore.
I personally don't have nostalgia for UNIX workstations, and if I had one (too damn expensive :/), I'd run one of the FOSS BSDs on it because I enjoy running period-incorrect software on old machines. But I feel horrible for those UNIX workstation enthusiasts who don't share my aesthetic. They're having significant trouble getting their old machines to run the way they want, to enjoy computing on their own terms.
I got pretty angry reading this. Which I take as: the article is doing a good job. I don't usually see problems like this in the DOS world. I wonder why...
¹ As opposed to "don't run those old versions anymore"; I'm not certain "don't" happens before "can't". If the software is available to play with, people will use it.
unfortunately, all deeply accurate. and it pains me to see--it's my belief that if you're going to collect old unix workstations, it's important to remember the context. these machines were very rarely expected to run on their own. they boot up with the expectation that they're going to be pulling resources from the network to get themselves going. but, okay, let's assume you've got a computer lying around that can feed it whatever it needs to get going.
now what? i ran into this when i picked up an old VAXstation on the cheap... DEC was acquired by Compaq, which was acquired by Hewlett-Packard, which split itself up into HP and Hewlett-Packard Enterprise, and we're now at the point where anyone who cared about DEC's products has, with a high degree of probability, left the company. probably to retire so they don't have to think about computers in a professional capacity ever again. (fates bless them for achieving freedom.) since basically nobody's left who cares, there's no interest in trying to make software and patches and documentation findable. HP's support sites were legendarily fragmented and bad even before the corporate split. good luck if you want anything from the pre-Oracle days at Sun.
i met someone recently who affectionately refers to his collection of vintage computing as e-waste. he's gone to great lengths to ensure that everything remains functional, and that they can boot up in something resembling the environment they expect to thrive in. but this requires effort, and frankly it requires knowing the right people to be able to get yr hands on software. the retrocomputing community makes it easier, of course, but some things are still only whispered about because the patents are still active, or even more improbably, the software's still in development. HP sold VMS, the old DEC operating system, off to a third-party company who's still developing and supporting it. but only for x86. the old hobbyist-license program that people were using to keep old VAXen running? dead, as far as i can tell. HPE shut their hobbyist program down when they transferred the rights to VMS, and the new company isn't offering them. it says it right there on the new sign-up page:
Please note that in accordance with the license agreement between VMS Software Inc. and HPE, VMS Software Inc. are not able to distribute VAX licenses.
so go fuck yrself, i guess. it breaks my heart a little that these elements of computing history are so thoroughly abandoned and lost. partly out of nostalgia, to be sure, whomst among us doesn't yearn for when things... at least, we thought they sucked less. in some objective ways, they did. but goddamn, it sucks to see old ideas get implemented in worse ways--or, even more annoyingly, old ideas get completely ignored in favour of some shiny bullshit that isn't even half-baked.
sic transit gloria, and all that shit. keep circulating the tapes.
To expand on what Liz is talking about: I think one of the even bigger (literally) tragedies of retrocomputing is that it is mostly constrained to individual-scale machines.
The Living Computer Museum in Seattle closed during the pandemic. It seems unlikely that it will ever open again. While I had some Doubts about many of their curation choices, they did have an entire floor of interactive exhibits. Most of them were microcomputers, and it was very cool that you could use a Lisa, a Sun, a TI 99, a Tandy CoCo, and a NeXT in short order, with your own two hands, at your own pace. I know the preeminent collector of Xerox graphical workstations, and yet the first time I ever used one was at the LCM, not at his house.
However, what was even cooler was the Mainframe Room. A door at one end of the second floor led into a loud, cold room with raised flooring, in which a variety of machines from CDC, IBM, and others lived. There were terminals along one wall which would let you log into these systems, and in fact, you could even telnet into some of them from home.
You couldn't touch the hardware; there were big signs to this effect everywhere - but it didn't make sense to touch these machines anyway, and this is germane to my point.
An IBM System/360 is not a "device," unless you also think we should apply that term to, say, a steel mill. Yes, iron comes in one end and steel I-beams come out the other, and it all works together, but undeniably it is a collection of many devices, gadgets, and machines that collectively accomplish a goal.
I don't know which is more complex: the infrastructure behind a steel mill, or that behind a mainframe computer, but I know that neither one can be casually reconstructed.
If you find a Sun anything lurking in a disused office at your university, there's a good chance you can bring it home, plug in an IEC cable and possibly a monitor you already own (or one you can get on eBay for $400 shipped), and boot it right up. Perhaps the hard drive is dead, but you can still watch the firmware come up, and maybe replace the HDD.
You won't find a System/360, and if you do, you will not be able to get it working. I have only ever heard of maybe a half dozen people under 60 who would even know where to start. They are extremely complex, extremely specific, and extremely big. They also did not get casually discarded or forgotten in offices - most of them were deliberately sent to the trash, because they were simply too massive to keep around once they weren't needed anymore.
There were countless incompatible revisions of everything. If you want a working mainframe, you will need to scour high and low, far and wide to bring together hundreds of parts, largely unlabeled except for line noise like "10-1058A." Even the cables are so obscure that you will have to build them, not buy them. Some of the connectors were custom.
Nothing will work. You will need to read schematics in manuals that are not on the internet, then do board repairs at the component level. You will need to test single transistors and know how to tell if one is "injured," because they will not necessarily be entirely dead, nor will it be practical to bulk-replace them. You will make hours-long repairs that do not fix the problem. You simply cannot tinker one of these back to life; you will need to become a genuine expert.
You will end up with at least one full 19" rack; probably several. I know someone whose entire garage is currently occupied by a single computer. I do not believe anyone else has one of these. There is no room to move around it. It does not work, and if it did, it would require a 480V AC supply at an amperage that would be challenging to obtain at most warehouses.
Once you've done all this work and you get the thing to boot up, congratulations: it's very, very boring. Even the people who are into these things struggle to make them do anything. I was going to add some adjectives to that sentence, but that's really it.
Among one of my retrocomputing friend groups, the joke is that mainframes were for printing invoices. This is basically accurate. Generally, you can't sit down at a mainframe and "open a program," and if you can, it's going to be unspeakably austere: a blank terminal with a blinking cursor and a couple meaningless numbers at the top and bottom of the screen, considered a UI masterpiece for its time.
Most of the programs are also gone - 99% of the software that ever existed for these systems was bespoke, never left the company that developed it, and even if you get it, you'll probably be in the "Disk 5 of 10" situation, where it depends on external systems that no longer exist. You might manage to get a hold of the tape labeled "LITECORP ACCNTNG 1980 V1.6", but you aren't going to get the blank database that was handcrafted when the thing was first written, without which it can't run.
And even if you got all that... it probably just prints invoices. That's what these were for. All your bank branches or insurance agencies send a tape once a month with the output from the minicomputer that runs the terminals at everyone's desks, and then a clerk reads each one into the machine, it slurps up all the customer records, and then a "chain printer" begins spewing (literally) bills that will later be stuffed into envelopes and mailed out.
Very probably, this system had no UI other than "INSERT TAPE TO READ." If the software barfed, it probably dropped a physical trouble ticket* and a "Systems Analyst" (midcentury term for "devops thought lord") would either stare at the raw database or launch a debugger for some horrifying sludge language like PL360.
* This is artistic license - I don't know that the physical "ticket drop" ever occurred outside of the phone company.
I have very little firsthand experience with these, I admit, and I'm conflating stories from other people. But this seems to be the consensus among everyone I've known who used this kind of gear.
Likewise, supercomputers - a word that has inspired awe in nerds for decades - are absolutely mind-numbingly uninteresting. A supercomputer is basically a device which accepts 10GB of integers, sits spinning its fans for two days, and then spits out "10.582338." Scientists hoot and holler; this means something to them, but to nobody else.
Basically: retrocomputing mostly orbits devices that are easy, convenient, and immediately satisfying to collect and restore. In much the same way that people collect cars, but not so much semi trucks or locomotives, which have to be preserved by Organizations, I don't really see much discussion of what is to become of Big Iron. And then there's stuff even further beyond that.
There's that whole YouTube series about restoring the Apollo Guidance Computer - which I think tailed off in viewership massively after one or two eps, because it turns out that, yeah, you talk to it in line noise and it just spits out a bunch of red LED digits. You can't, exactly, "play" with it, or even "use" it for anything other than landing a spaceship.
It's good someone's fixing that up. But let's go even further: Who's preserving the bowling alley computers?
I've been thinking about this for literally decades. Something drives all those monitors over the lanes that show the scores, and then the skier wiping out when you whiff it. Nowadays they might be dedicated devices, but we can be absolutely certain that in 1995, there wasn't a video playback unit in every single one - and in 1990, there wasn't even a computer in every one.
They had to be terminals. But what kind? Obviously they were graphical, and they had those little custom keyboards. There almost certainly wasn't one dedicated computer per lane to talk to those things. My guess is that the screens were "graphical dumb terminals", if you will, and in the back room there was a minicomputer with 50 serial ports, half to talk to the keyboards, half to send proprietary drawing commands to the displays.
And then the video clips? Where did those come from? My guess is: Bank of five or six laserdisc players that get switched through to a display via some obnoxious serial-controlled matrix switcher whenever a clip needs to play.
The input from the lane sensors has to be some nightmarish spiderweb of 22-gauge wires tied into gigantic bundles that run from the mechanics into the back room and terminate in some horrific 256-lane GPIO module.
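(To make the guess concrete: here is a purely hypothetical sketch, in Python only because nothing about this hardware or its wire protocol seems to be documented anywhere, of what that back-room minicomputer's job might have been - sensor events come in, invented drawing commands fan out over a pile of serial ports. Every name, command byte, and number in it is made up for illustration, and the serial writes are stubbed out so the sketch actually runs.)

```python
# Purely hypothetical sketch of the imagined back-room lane controller.
# Nothing here reflects real hardware: the "drawing command" byte format,
# the per-lane serial fan-out, and the pinfall interface are all invented.

from dataclasses import dataclass

LANES = 4  # a real house would have 16-40 of these


@dataclass
class LaneState:
    frame: int = 1
    pins_down: int = 0


def encode_draw_command(lane: int, state: LaneState) -> bytes:
    # Made-up protocol for a "graphical dumb terminal": start byte, lane
    # number, frame and pin counts; the display firmware would turn this
    # into an actual scoreboard drawing.
    return bytes([0x02, lane, state.frame, state.pins_down, 0x03])


def send_to_display(lane: int, payload: bytes) -> None:
    # In the imagined setup this would be a write() to one of ~50 serial
    # ports on the minicomputer; here we just print the bytes.
    print(f"lane {lane} display <- {payload.hex(' ')}")


def on_pinfall(lanes, lane: int, pins: int) -> None:
    # The sensor bundle (that hypothetical 256-lane GPIO module) reports
    # how many pins went down; redraw that lane's display.
    lanes[lane].pins_down = pins
    send_to_display(lane, encode_draw_command(lane, lanes[lane]))
    if pins == 10:
        # Roughly where the serial-controlled matrix switcher would be
        # told to cue up a laserdisc clip for that lane.
        print(f"lane {lane}: cue 'strike' clip on video switcher")


if __name__ == "__main__":
    lanes = [LaneState() for _ in range(LANES)]
    on_pinfall(lanes, 0, 7)   # lane 0 knocks down 7
    on_pinfall(lanes, 2, 10)  # lane 2 gets a strike
```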
And nobody knows anything about all of this, as far as I can tell. I'm making shit up from whole cloth because there are zero webpages about it. And as old lanes go out of business and get demolished, or upgrade to newer gear, you can be sure the old stuff is just being scrapped for its gold.
So what's the point of this rambling post? This: We are losing more than we are saving. How should we feel about this? What should we do?
Well, you can try to Paul Allen (founder of the Living Computer Museum) the problem away. Become rich, or inveigle yourself with people who are rich, and begin traveling the world, collecting every scrap of every single machine that has ever been made and packing it all into a series of warehouses.
But then, where should you stop? If your warehouse contains System/360s, shouldn't it contain bowling alley computers? Why not traffic light controllers? Avionics from mid-80s Boeing jets? What computer isn't worth preserving? You can either keep absolutely everything, or set an arbitrary line somewhere.
Keeping absolutely everything is problematic. I've been to the warehouses of people who did this. The machines sit. Nobody is using them. They are inconvenient to dig out, and in uncertain condition. No one has the energy, time, and resources to get them all working (not to mention the fact that much of "fixing" old machines is really robbing Peter to pay Paul; eventually we will run out of machines with donor parts).
If you do pick an arbitrary cutoff for what to keep, well, you're probably going to have to go with "is it interesting / unique / relevant." Can anyone alive now relate to it? Can it inform us in some way about the past, or give us ideas about how to improve the present? And can nothing else provide the same stimulus?
The problem is that once you do this... you're probably going to throw out a lot of those Unix workstations. What makes them special? Often, not very much. They're mostly "crappy linux." They mostly had very similar commands and similar UI, and mostly ran software that was either very austere, or "version 1 of a program that was ported near-identically to Windows NT in version 3." I have used several of them - SGIs, Suns, HP Apollos, AT&T Unix PCs, to name a few - and even the people who knew these machines intimately shrugged when I asked "what can I actually do on here, what makes this special." They admitted readily that the answer was "nothing, it kind of sucks."
Of course, one option is to simply accept all this in a "cosmic truth" sense. Yes, we are only preserving a fraction of what was made - but in 50 years, it will very probably all be dead. Why waste our lives struggling to preserve a past that is simply decaying? Do we really get that much out of it?
shit like this is extremely sad. i like retrocomputing and would absolutely grab a terminal or two if i could afford one, but by far my biggest interest is in these very large systems that are impossible to get, let alone keep working.
with the obvious exception of the really old stuff with clacking relays or big visible tapes, and perhaps the ambience these kinds of systems create. for me at least, the interesting bit has never been in the actual executing of the program, but instead it's in how these things were made and used. what did they do and why? these things were feats of their time, but the parts that actually make them feats are almost never shown! even a museum can't really show you these kinds of things, or you'd be looking at a single artifact all day
all this really does make me wonder though: what even makes these things interesting to y'all anyway? the overwhelming majority of all these old systems i'd never even touched, probably never will, and they were built or even decommissioned before i was even born, and yet, they still captivate. why?
sad to hear about the lcm shutting its doors, it's been a bucket list destination and is pretty much the only reason i had for seriously considering visiting the us again
this is quite possibly the most depressing thread i've read so far on cohost. i want to go and argue against some point, but there isn't a single point by any of you that is even slightly wrong. that said, if anyone actually does have any old avionics they don't want anymore, i'd love to have a piece or two.
what even makes these things interesting to y'all anyway?
You just activated my Trap Card!
The conclusion I came to nearly ten years ago is that we don't want an SGI workstation, we want to need an SGI workstation, for our job, in its heyday.
An excerpt from a script I never got around to producing:
Suns and SGIs are considered cool by a lot of retrocomputing people, but if you ask me, they got that status entirely because they were once part of something cool - they were making 3d graphics and serving webpages at a time when those were extremely hard things to do, and so the people working on them were making history. Most retrocomputing people are probably unaware that that's why these were trumpeted as unique and special for decades, and that's why it's not often acknowledged that they really aren't very different from ordinary desktop PCs that would come out just a few years later.
SGIs did not produce remarkable 3D graphics - they did the same stuff your PC could do in 2003, just a decade earlier. I would guess that you don't really want to own a machine that has almost no software other than a couple of 3D packages, with which it can make graphics on par with your PC 18 years ago - software packages that were all ported to Windows NT, virtually unchanged, a few years later when x86 caught up.
What you probably want a lot more than that is to have Been There, in 1993, pounding Jolt Colas and making the dinosaurs in Jurassic Park. And the part of that that happens on the computer isn't that special. You could do the same thing on your PC now, but you would need to have 3D modeling and animation skills, which you likely don't. And for it to mean anything, you would need a project - a director, a script, and so on. By itself, the machine just makes polygons, and not very good ones by the standards of just a few years later.
There are many differences of course between those and [the video mixer I was covering in this script,] but like the SGI workstation, it wants to do a job that it can't do anymore.
This machine wants to make television. To see it do that, you need to have the skills to use it - but you also need the television. The actual show, the script, the director, the actors, the cameras, the sets. I don't have those. So I can show you what it can do, but I'm not showing you what it did, and to me that's a huge bummer. This machine was sent to me by someone who once used it to make actual television, and I wish I could make it do that.
I am not saying, by this, that you literally want to be a VFX artist in 1993. Perhaps you have no artistic inclination - neither do I. There wasn't a divergence point in my life where I could have become a VFX artist, nor do I wish that was the life I was living.
What I'm saying is that these machines were There. They were present in a remarkable time in history, doing something then-incredible, and that emits an irresistible energy to a particular sort of mind, imo, ime.
We do not wish to be writers, but we wish to have written a book. To have been there, in the thick of it, to have the war stories - or to be creating those war stories now. Because, after all, how many of us like our jobs? How many of us feel like we're doing something remarkable, something memorable?
Some of the replies within this very choststream address the fact that a tremendous number of the people in this greater demographic are tech workers who are doing nothing of consequence, and know it, and hate it. Many of us know our work will be literally thrown in the trash at the whims of our employer, and in the meantime, it will mean very little to anyone.
It is a tragedy of the modern world that so many talented people have to make a living by helping the rich get richer, rather than contributing to the greater good of humanity, or even just indulging their own interests. But wouldn't it be incredible to have been there? For... anything? To be present at the launch of something historical? To work on a project that's a decade ahead of its time?
Or... just to do something that's not easy. Not in an "effortless" or "unskilled" sense, but in the sense that it's not a foregone conclusion. Sure, writing complex software or doing SRE or whatever can be hard, but it's going to get done one way or another. If not by you, then by someone else. Perhaps worse, but still functionally. Any inefficiencies will be overcome with additional hardware; any inaccuracies will be overcome by not giving a shit.
But as you go back, you pass through eras where many tasks were accomplishments just by nature of having happened at all.
Those SGI workstations made Jurassic Park. They made Babylon 5 (I think.) They made Star Trek: Voyager. They made Toy Story (although I think they rendered it on NT boxes. do not quote this.) Even if none of us have aspirations of making 3D movies, we still recognize that these were incredible events to be present for. Even if the art was going into some forgettable trash fire of a movie, the wizards running the machines were performing the impossible, and the machines were their spellbooks.
I don't mean to speak for everyone. I am making an example which I think a lot of people will understand. These are specifics, but I think in general, we find it appealing to imagine a Computing Experience - something we all have a connection to already - that was, at its time, not Ordinary.
Using a Mac in 1984 was wild. Using a TI 99 in 1979 was wild. Using a PC in 1981 was wild. Using an SGI workstation when those were relevant was wild. And using a mainframe in 1968 was wild.
Not all the questions were answered. Not all the possibilities were known. Not all the impossibilities were known. Some things couldn't be done! Oh, there are still things that can't be done, but they're big. They're mathematically- or logically-provable impossibilities or impracticalities, like the post from a couple weeks ago about how every function you can request from a database is either impossible or slow.
If you were an Analyst in 1968, and you wanted to calculate some figures on an IBM mainframe, derived from millions of statistical records, and then produce a graph to be displayed at a conference, you had to:
- Get them into the machine. How? Punchcards, generally. Someone is going to have to key all those cards. That's a ton of man-hours; can it be reduced? There might be entry errors. The cards might read wrong. How can you prepare for these possibilities?
- Process the data. You're sharing the machine with other users, your task requires human-scale amounts of time that could range from minutes to hours, and there is limited multitasking (on some machines: none whatsoever). You can't just tie up the machine indefinitely, so you need to work hard to make your software efficient, or it might not be done in time for the conference.
- Output the results. Probably to a line printer, at which point someone needs to take the figures, manually plot a graph on a drafting table, and then photograph it to make slides.
At every step of the process, things could explode, in some cases literally. There is no "ordinary" in this job. You need to be attached to the project all the way through, babying it, making sure all goes well, ready for anything, and when you're done, you know that actually finishing this work was not a guaranteed outcome.
This is an exciting job. Ask people who worked at the phone company about their job - were they passionate about phones? Did they really care about communication? Maybe, but they were too busy to care. Busy with new tasks, unknown challenges, new problems, all very concrete. Not made-up bullshit, not busywork, not solved problems unsolving themselves for no reason. When a phone switch (or a mainframe) broke, it wasn't because of malfeasance; it was because making things was hard. They were living at the cutting edge.
I believe that this is at the root of much of our fascination. We live in an era of solved problems. The low-hanging and medium-distance-hanging fruit is entirely picked. If computers are your job or your hobby, you're probably going to spend most of your time reinventing the wheel. If you're doing anything really novel, anything with an uncertain outcome, you're probably working at a level so high that the overwhelming majority of people can't really relate to it. But there was a time when simply Being On The Computer was, in itself, a remarkable feat.
There are many generalizations here, of course, but this is what I feel is going on, based on ~20 years of observations and personal reflection.
hey, "super quick sidenote" is 1700 words.
the TL;DR of my addition is: i once had an SGI Octane that would've cost $30,000 in 1997. i ran SGI's graphics framework demo on it and liked it. i connected to it with X11, and my mac laptop, which was $2,799 in 2002, ran absolute rings around it in graphical performance. just, absolutely shameful trouncing.
and a couple other notes.
The stories of SGI machines here kinda track with what nearly happened to Apple themselves with the PowerPC line.
There's a tendency to think of tech advancement as linear with the occasional discrete jump, but the timelines really sort of ... overlap, quite a lot.
There's often a lag where an incumbent technology has enough momentum to coast on legacy buy-in, and so many of the great falls of computer history are a matter of a company still thinking they're queen of the roost, not realizing that the long tail is where they actually are and that the market just hasn't figured it out yet.
As @coryw said, this is what happened to a lot of the big "workstation" companies. The reputation of "big power" suddenly came to a screeching halt as customers realized that x86 had caught up well enough they could stop paying $20,000 for a file server. Some of them flailed spectacularly throwing good money after bad trying to catch up, and most of those are now weird entries in history books about architectures that never took off.
Apple, after so much hype about the G3 and G4 lines' supposed technical superiority, was on the precipice of that same mistake by the time of the embarrassingly awful G5. Overpriced, underpowered, hot enough to burn down a barn. Whatever advantage they ever had over x86, they'd lost it again, and as much as some of us loyalists hated it ... moving to Intel was the way to go here.
I have felt for some time now that Intel is essentially on that same ledge: far into the long tail but coasting on legacy buy-in on a truly epic scale ... x86 chips just suck. They're huge, slow, hot, power-hungry, and burdened with so much legacy guff that there's really only so much that can be done. I've seen it coming since I first used my old ASUS Transformer Prime, but back then the software wasn't ready yet by a long shot.
This time Apple have been the ones quietly investing in ARM, and now they have a laptop that makes every Intel/x86 laptop I've ever owned look like a fucking dinosaur. I hate that. I hate that it's true. But I got my first M1 machine for work, and once you have that in your hands, once you have a laptop that makes no sound, no heat, runs for days, and all somehow without being any slower than the equivalent Intel rig ... suddenly my new gamer PC feels like that SGI probably did in 2002.
The software hasn't quite caught up to it, and for certain graphics and gaming applications I suspect it may be a revision or two before it reaches full parity still, but for most users and consumers that aren't weird gamer nerds like me? Just buy the fucking ARM chip.
The only reason Intel isn't completely screwed right now is that Apple is keeping their chips to themselves and will never dare charge less for anything they do; that leaves Qualcomm for the rest of us, and they are a fucking clown show.

