


cr1901
@cr1901

In this "fun"-filled article, we get to see a retrocomputing enthusiast:

  1. Try to find a UNIX workstation that doesn't cost a huge sum of money, and get lucky.
  2. Then try to find HP-UX install media, which HP no longer offers, so they have to ask around on the Fediverse for people with the correct install media.
  3. Then try to run period-correct software that was written for an earlier version of HP-UX (which runs thanks to backwards compatibility), because they can't find a version online tailored to later versions of HP-UX.

They failed to run a lot of software they specifically got for their shiny new HP-UX box. This is because a lot of UNIX workstation software is/was proprietary and required licenses to use. Even the demo licenses expired long ago. And the still-existing vendors aren't interested in helping out enthusiasts and/or don't even have the tools to generate licenses for their old versions anymore. Don't take my word for it; the author of the linked article tried to get licenses, and all the conversations fizzled out.

This article is a harsh lesson in how FlexLM and DRM are cancers, and how companies treat their own old software like trash to be swept away. Software is being destroyed at an alarming rate due to negligence. Even if the software isn't commercially viable anymore and the hardware platforms are niche, I argue that the hard work and energy put into creating the software are only truly lost when people can't[1] run those old versions anymore.

I personally don't have nostalgia for UNIX workstations, and if I had one (too damn expensive :/), I'd run one of the FOSS BSDs on it b/c I enjoy running period-incorrect software on old machines. But I feel horrible for those UNIX workstation enthusiasts who don't share my aesthetic. They're having significant trouble getting their old machines to run the way they want, to enjoy computing on their own terms.

I got pretty angry reading this. Which I take as: the article is doing a good job. I don't usually see problems like this in the DOS world. I wonder why...

  [1] As opposed to *don't* run those old versions anymore; I'm not certain *don't* happens before *can't*. If the software is available to play with, people will use it.

selectric
@selectric

unfortunately, all deeply accurate. and it pains me to see--it's my belief that if you're going to collect old unix workstations, it's important to remember the context. these machines were very rarely expected to run on their own. they boot up with the expectation that they're going to be pulling resources from the network to get themselves going. but, okay, let's assume you've got a computer lying around that can feed it whatever it needs to get going.

now what? i ran into this when i picked up an old VAXstation on the cheap... DEC was acquired by Compaq, which was acquired by Hewlett-Packard, which split itself up into HP and Hewlett-Packard Enterprise, and we're now at the point where anyone who cared about DEC's products has, with a high degree of probability, left the company. probably to retire so they don't have to think about computers in a professional capacity ever again. (fates bless them for achieving freedom.) since basically nobody's left who cares, there's no interest in trying to make software and patches and documentation findable. HP's support sites were legendarily fragmented and bad even before the corporate split. good luck if you want anything from the pre-Oracle days at Sun.

i met someone recently who affectionately refers to his collection of vintage computing as e-waste. he's gone to great lengths to ensure that everything remains functional, and that they can boot up in something resembling the environment they expect to thrive in. but this requires effort, and frankly it requires knowing the right people to be able to get yr hands on software. the retrocomputing community makes it easier, of course, but some things are still only whispered about because the patents are still active, or even more improbably, the software's still in development. HP sold VMS, the old DEC operating system, off to a third-party company who's still developing and supporting it. but only for x86. the old hobbyist-license program that people were using to keep old VAXen running? dead, as far as i can tell. HPE shut their hobbyist program down when they transferred the rights to VMS, and the new company isn't offering them. it says it right there on the new sign-up page:

Please note that in accordance with the license agreement between VMS Software Inc. and HPE, VMS Software Inc. are not able to distribute VAX licenses.

so go fuck yrself, i guess. it breaks my heart a little that these elements of computing history are so thoroughly abandoned and lost. partly out of nostalgia, to be sure, whomst among us doesn't yearn for when things... at least, we thought they sucked less. in some objective ways, they did. but goddamn, it sucks to see old ideas get implemented in worse ways--or, even more annoyingly, old ideas get completely ignored in favour of some shiny bullshit that isn't even half-baked.

sic transit gloria, and all that shit. keep circulating the tapes.


cathoderaydude
@cathoderaydude

To expand on what Liz is talking about, I think: One of the even bigger (literally) tragedies of retrocomputing is that it is mostly constrained to individual-scale machines.

The Living Computer Museum in Seattle closed during the pandemic. It seems unlikely that it will ever open again. While I had some Doubts about many of their curation choices, they did have an entire floor of interactive exhibits. Most of them were microcomputers, and it was very cool that you could use a Lisa, a Sun, a TI 99, a Tandy CoCo, and a NeXT in short order, with your own two hands, at your own pace. I know the preeminent collector of Xerox graphical workstations, and yet the first time I ever used one was at the LCM, not at his house.

However, what was even cooler was the Mainframe Room. A door at one end of the second floor led into a loud, cold room with raised flooring, in which a variety of machines from CDC, IBM, and others lived. There were terminals along one wall which would let you log into these systems, and in fact, you could even telnet into some of them from home.

You couldn't touch the hardware, there were big signs to this effect everywhere - but it didn't make sense to touch these machines, and this is germane to my point.

An IBM System/360 is not a "device," unless you also think we should apply that term to, say, a steel mill. Yes, iron comes in one end and steel I-beams come out the other, and it all works together, but undeniably it is a collection of many devices, gadgets, and machines that collectively accomplish a goal.

I don't know which is more complex: the infrastructure behind a steel mill, or that behind a mainframe computer, but I know that neither one can be casually reconstructed.

If you find a Sun anything lurking in a disused office at your university, there's a good chance you can bring it home, plug in an IEC cable, possibly a monitor you already own (or one you can get on eBay for $400 shipped), and boot it right up. Perhaps the hard drive dies, but you can see the firmware go, and maybe replace the HDD.

You won't find a System/360, and if you do, you will not be able to get it working. I have only ever heard of maybe a half dozen people under 60 who would even know where to start. They are extremely complex, extremely specific, and extremely big. They also did not get casually discarded or forgotten in offices - most of them were deliberately sent to the trash, because they were simply too massive to keep around once they weren't needed anymore.

There were countless incompatible revisions of everything. If you want a working mainframe, you will need to scour high and low, far and wide to bring together hundreds of parts, largely unlabeled except for line noise like "10-1058A." Even the cables are so obscure that you will have to build them, not buy them. Some of the connectors were custom.

Nothing will work. You will need to read schematics in manuals that are not on the internet, then do board repairs at the component level. You will need to test single transistors and know how to tell if one is "injured," because they will not necessarily be entirely dead, nor will it be practical to bulk-replace them. You will make hours-long repairs that do not fix the problem. You simply cannot tinker one of these back to life; you will need to become a genuine expert.

You will end up with at least one full 19" rack; probably several. I know someone whose entire garage is currently occupied by a single computer. I do not believe anyone else has one of these. There is no room to move around it. It does not work, and if it did, it would require a 480V AC supply at an amperage that would be challenging to obtain at most warehouses.

Once you've done all this work and you get the thing to boot up, congratulations: it's very, very boring. Even the people who are into these things struggle to make them do anything. I was going to add some adjectives to that sentence, but that's really it.

Among one of my retrocomputing friend groups, the joke is that mainframes were for printing invoices. This is basically accurate. Generally, you can't sit down at a mainframe and "open a program," and if you can, it's going to be unspeakably austere: a blank terminal with a blinking cursor and a couple meaningless numbers at the top and bottom of the screen, considered a UI masterpiece for its time.

Most of the programs are also gone - 99% of the software that ever existed for these systems was bespoke, never left the company that developed it, and even if you get it, you'll probably be in the "Disk 5 of 10" situation, where it depends on external systems that no longer exist. You might manage to get a hold of the tape labeled "LITECORP ACCNTNG 1980 V1.6", but you aren't going to get the blank database that was handcrafted when the thing was first written, without which it can't run.

And even if you got all that... it probably just prints invoices. That's what these were for. All your bank branches or insurance agencies send a tape once a month with the output from the minicomputer that runs the terminals at everyone's desks, and then a clerk reads each one into the machine, it slurps up all the customer records, and then a "chain printer" begins spewing (literally) bills that will later be stuffed into envelopes and mailed out.

Very probably, this system had no UI other than "INSERT TAPE TO READ." If the software barfed, it probably dropped a physical trouble ticket* and a "Systems Analyst" (midcentury term for "devops thought lord") would either stare at the raw database or launch a debugger for some horrifying sludge language like PL360.

* This is artistic license - I don't know that the physical "ticket drop" ever occurred outside of the phone company.

I have very little firsthand experience with these, I admit, and I'm conflating stories from other people. But this seems to be the consensus among everyone I've known who used this kind of gear.

Likewise, supercomputers - a word that has inspired awe in nerds for decades - are absolutely mind-numbingly uninteresting. A supercomputer is basically a device which accepts 10GB of integers, sits spinning its fans for two days, and then spits out "10.582338." Scientists hoot and holler; this means something to them, but to nobody else.

Basically: retrocomputing mostly orbits devices that are easy, convenient, and immediately satisfying to collect and restore. In much the same way that people collect cars, but not so much semi trucks or locomotives, which have to be preserved by Organizations, I don't really see much discussion of what is to become of Big Iron. And then there's stuff even further beyond that.

There's that whole YouTube series about restoring the Apollo computer - which I think tailed off in viewership massively after one or two eps, because it turns out that, yeah, you talk to it in line noise and it just spits out a bunch of red LED digits. You can't, exactly, "play" with it, or even "use" it for anything other than landing a spaceship.

It's good someone's fixing that up. But let's go even further: Who's preserving the bowling alley computers?

I've been thinking about this for literally decades. Something drives all those monitors over the lanes that show the scores, and then the skier wiping out when you whiff it. Nowadays they might be dedicated devices, but we can be absolutely certain that in 1995, there wasn't a video playback unit in every single one - and in 1990, there wasn't even a computer in every one.

They had to be terminals. But what kind? Obviously they were graphical, and they had those little custom keyboards. There almost certainly wasn't one dedicated computer per lane to talk to those things. My guess is that the screens were "graphical dumb terminals", if you will, and in the back room there was a minicomputer with 50 serial ports, half to talk to the keyboards, half to send proprietary drawing commands to the displays.

And then the video clips? Where did those come from? My guess is: Bank of five or six laserdisc players that get switched through to a display via some obnoxious serial-controlled matrix switcher whenever a clip needs to play.

The input from the lane sensors has to be some nightmarish spiderweb of 22 gauge wires tied into gigantic bundles that run from the mechanics into the backroom and terminate into some horrific 256-lane GPIO module.

And nobody knows anything about all of this, as far as I can tell. I'm making shit up from whole cloth because there are zero webpages about it. And as old lanes go out of business and get demolished, or upgrade to newer gear, you can be sure the old stuff is just being goldscrapped.

So what's the point of this rambling post? This: We are losing more than we are saving. How should we feel about this? What should we do?

Well, you can try to Paul Allen (founder of the Living Computer Museum) the problem away. Become rich, or inveigle yourself with people who are rich, and begin traveling the world, collecting every scrap of every single machine that has ever been made and packing it all into a series of warehouses.

But then, where should you stop? If your warehouse contains System/360s, shouldn't it contain bowling alley computers? Why not traffic light controllers? Avionics from mid-80s Boeing jets? What computer isn't worth preserving? You can either keep absolutely everything, or set an arbitrary line somewhere.

Keeping absolutely everything is problematic. I've been to the warehouses of people who did this. The machines sit. Nobody is using them. They are inconvenient to dig out, and in uncertain condition. No one has the energy, time, and resources to get them all working (not to mention the fact that much of "fixing" old machines is really robbing Peter to pay Paul; eventually we will run out of machines with donor parts.)

If you do pick an arbitrary cutoff for what to keep, well, you're probably going to have to go with "is it interesting / unique / relevant." Can anyone alive now relate to it? Can it inform us in some way about the past, or give us ideas about how to improve the present? And can nothing else provide the same stimulus?

The problem is that once you do this... you're probably going to throw out a lot of those Unix workstations. What makes them special? Often, not very much. They're mostly "crappy linux." They mostly had very similar commands and similar UI, and mostly ran software that was either very austere, or "version 1 of a program that was ported near-identically to Windows NT in version 3." I have used several of them - SGIs, Suns, HP Apollos, AT&T Unix PCs, to name a few - and even the people who knew these machines intimately shrugged when I asked "what can I actually do on here, what makes this special." They admitted readily that the answer was "nothing, it kind of sucks."

Of course, one option is to simply accept all this in a "cosmic truth" sense. Yes, we are only preserving a fraction of what was made - but in 50 years, it will very probably all be dead. Why waste our lives struggling to preserve a past that is simply decaying? Do we really get that much out of it?


FoxBall
@FoxBall

shit like this is extremely sad, i like retrocomputing and would absolutely grab a terminal or two if i could afford one, but by far my biggest interest is in these very large systems that are impossible to get, and impossible to get and keep working.

with the obvious exception of the really old stuff with clacking relays or big visible tapes, and perhaps the ambience these kinds of systems create, for me at least the interesting bit has never been in the actual executing of the program; instead it's in how these things were made and used. what did they do and why? these things were feats of their time, but the parts that actually make them feats are almost never shown! even a museum cant really show you these kinds of things or you'd be looking at a single artifact all day

all this really does make me wonder though, what even makes these things interesting to y'all anyway? the overwhelming majority of these old systems i'd never even touched, and probably never will, and they were built or even decommissioned before i was even born, and yet, they still captivate. why?

sad to hear about the lcm shutting its doors, its been a bucket list destination and is pretty much the only reason i had for seriously considering visiting the us again

this is quite possibly the most depressing thread ive read so far on cohost. i want to go and argue against some point, but there isnt a single point by any of you that is even slightly wrong. that said, if anyone actually does have any old avionics they dont want any more, i'd love to have a piece or two.


cathoderaydude
@cathoderaydude

what even makes these things interesting to y'all anyway?

You just activated my Trap Card!

The conclusion I came to nearly ten years ago is that we don't want an SGI workstation, we want to need an SGI workstation, for our job, in its heyday.

An excerpt from a script I never got around to producing:

Suns and SGIs are considered cool by a lot of retrocomputing people, but if you ask me, they got that status entirely because they were once part of something cool - they were making 3d graphics and serving webpages at a time when those were extremely hard things to do, and so the people working on them were making history. Most retrocomputing people are probably unaware that that's why these were trumpeted as unique and special for decades, and that's why it's not often acknowledged that they really aren't very different from ordinary desktop PCs that would come out just a few years later.

SGIs did not produce remarkable 3D graphics - they did the same stuff your PC could do in 2003, just a decade earlier. I would guess that you don't really want to own a machine that has almost no software other than a couple of 3D packages, with which it can make graphics on par with your PC 18 years ago - software packages that were all ported to Windows NT, virtually unchanged, a few years later when x86 caught up.

What you probably want a lot more than that is to have Been There, in 1993, pounding Jolt Colas and making the dinosaurs in Jurassic Park. And the part of that that happens on the computer isn't that special. You could do the same thing on your PC now, but you would need to have 3D modeling and animation skills, which you likely don't. And for it to mean anything, you would need a project - a director, a script, and so on. By itself, the machine just makes polygons, and not very good ones by the standards of just a few years later.

There are many differences of course between those and [the video mixer I was covering in this script,] but like the SGI workstation, it wants to do a job that it can't do anymore.

This machine wants to make television. To see it do that, you need to have the skills to use it - but you also need the television. The actual show, the script, the director, the actors, the cameras, the sets. I don't have those. So I can show you what it can do, but I'm not showing you what it did, and to me that's a huge bummer. This machine was sent to me by someone who once used it to make actual television, and I wish I could make it do that.

I am not saying, by this, that you literally want to be a VFX artist in 1993. Perhaps you have no artistic inclination - neither do I. There wasn't a divergence point in my life where I could have become a VFX artist, nor do I wish that was the life I was living.

What I'm saying is that these machines were There. They were present in a remarkable time in history, doing something then-incredible, and that emits an irresistible energy to a particular sort of mind, imo, ime.

We do not wish to be writers, but we wish to have written a book. To have been there, in the thick of it, to have the war stories - or to be creating those war stories now. Because, after all, how many of us like our jobs? How many of us feel like we're doing something remarkable, something memorable?

Some of the replies within this very choststream address the fact that a tremendous number of the people in this greater demographic are tech workers who are doing nothing of consequence, and know it, and hate it. Many of us know our work will be literally thrown in the trash at the whims of our employer, and in the meantime, it will mean very little to anyone.

It is a tragedy of the modern world that so many talented people have to make a living by helping the rich get richer, rather than contributing to the greater good of humanity, or even just indulging their own interests. But wouldn't it be incredible to have been there? For... anything? To be present at the launch of something historical? To work on a project that's a decade ahead of its time?

Or... just to do something that's not easy. Not in a "effortless" or "unskilled" sense, but in the sense that it's not a foregone conclusion. Sure, writing complex software or doing SRE or whatever can be hard, but it's going to get done one way or another. If not by you, then by someone else. Perhaps worse, but still functionally. Any inefficiencies will be overcome with additional hardware; any inaccuracies will be overcome by not giving a shit.

But as you go back, you pass through eras where many tasks were accomplishments just by nature of having happened at all.

Those SGI workstations made Jurassic Park. They made Babylon 5 (I think.) They made Star Trek: Voyager. They made Toy Story (although I think they rendered it on NT boxes. do not quote this.) Even if none of us have aspirations of making 3D movies, we still recognize that these were incredible events to be present for. Even if the art was going into some forgettable trash fire of a movie, the wizards running the machines were performing the impossible, and the machines were their spellbooks.

I don't mean to speak for everyone. I am making an example which I think a lot of people will understand. These are specifics, but I think in general, we find it appealing to imagine a Computing Experience - something we all have a connection to already - that was, at its time, not Ordinary.

Using a Mac in 1984 was wild. Using a TI 99 in 1979 was wild. Using a PC in 1981 was wild. Using an SGI workstation when those were relevant was wild. And using a mainframe in 1968 was wild.

Not all the questions were answered. Not all the possibilities were known. Not all the impossibilities were known. Some things couldn't be done! Oh, there are still things that can't be done, but they're big. They're mathematically- or logically-provable impossibilities or impracticalities, like the post from a couple weeks ago about how every function you can request from a database is either impossible or slow.

If you were an Analyst in 1968, and you wanted to calculate some figures on an IBM mainframe, derived from millions of statistical records, and then produce a graph to be displayed at a conference, you had to:

  • Get them into the machine. How? Punchcards, generally. Someone is going to have to key all those cards. That's a ton of man-hours, can it be reduced? There might be entry errors. The cards might read wrong. How can you prepare for these possibilities?

  • Process the data. You're sharing the machine with other users, your task requires human-scale amounts of time that could range from minutes to hours, and there is limited multitasking (on some machines: none whatsoever.) You can't just tie up the machine indefinitely, so you need to work hard to make your software efficient, or it might not be done in time for the conference.

  • Output the results. Probably to a line printer, at which point someone needs to take the figures, manually plot a graph on a drafting table, and then photograph it to make slides.

At every step of the process, things could explode, in some cases literally. There is no "ordinary" in this job. You need to be attached to the project all the way through, babying it, making sure all goes well, ready for anything, and when you're done, you know that actually finishing this work was not a guaranteed outcome.

This is an exciting job. Ask people who worked at the phone company about their job - were they passionate about phones? Did they really care about communication? Maybe, but they were too busy to care. Busy with new tasks, unknown challenges, new problems, all very concrete. Not made-up bullshit, not busywork, not solved problems unsolving themselves for no reason. When a phone switch (or a mainframe) broke, it wasn't because of malfeasance, it's because making things was hard. They were living at the cutting edge.

I believe that this is at the root of much of our fascination. We live in an era of solved problems. The low-hanging and medium-distance-hanging fruit is entirely picked. If computers are your job or your hobby, you're probably going to spend most of your time reinventing the wheel. If you're doing anything really novel, anything with an uncertain outcome, you're probably working at a level so high that the overwhelming majority of people can't really relate to it. But there was a time when simply Being On The Computer was, in itself, a remarkable feat.

There are many generalizations here, of course, but this is what I feel is going on, based on ~20 years of observations and personal reflection.


coryw
@coryw

hey, fair warning: this "super quick sidenote" is 1700 words.

the TL;DR of my addition is: I once had an SGI Octane that would've cost $30,000 in 1997. i ran SGI's graphics framework demo on it and liked it. then i connected to it over X11 from my mac laptop, and that laptop, $2,799 in 2002, ran absolute rings around it in graphical performance. just, absolutely shameful trouncing.

and a couple other notes.


super quick sidenote type of thing.

as a child, i was super into these machines. my dad had access to some of them at work (although it was mostly in service of network operations) and they were mentioned in the Mac magazines as the mystical next level of computing. in that context they were often framed as, like, the servers you wish your company would let you buy.

as a child, i was obsessed with these things because i had this unshakeable conviction they were somehow better than the computers they let "normal" people use. like UNIX was the answer to all the problems i had with my macs.

in retrospect, this didn't have the effect it should have, but as a kid i actually owned a couple of these things: an SGI Indy in almost fully tiptopped shape, and then an SGI Octane, although in a much more modest configuration.

the Indy is notable for having been pretty bad at "video capture," which was the use case I imagined for it. part of tiptopping it meant putting in the 24-bit graphics card, as well as the CosmoCompress video compression board and the video capture board. SGI included a rudimentary NLE with the system, but my system simply didn't have a big and fast enough disk to do video capture and editing the way my blue powermac g3 could.

sold that machine and got the Octane.

the Octane was fun because it came from the National Institutes of Health in Bethesda and had a few The Cars MP3s on it for whatever reason. Nevertheless, I reformatted it with some help from a new friend and got to Puttering. In 2005, there was a surprising amount of community support for SGI MIPS computers running 64-bit chips and IRIX 6.5.22 (or later if you could find someone willing to get you the ISOs or mail you a copy...) so I put firefox, openoffice, blender, lightwave, photoshop, illustrator, maybe premiere, and a couple other things on it.

the thing that stands out to me the most was Silicon Graphics Performer. Performer was SGI's "visualization framework" type of tool. you'd use it to make scenes or walk-throughs. You could do CAVEs and 3d graphics with it. It may even have been possible to do glasses-based virtual reality with it. kind of like Unity, I suppose, except that you can use Unity to port things to iphones and playstations, whereas Performer had a much heavier "demos on the big iron" slant.

anyway, one of my favorite Activities(TM) was to launch Performer and run the town demo. This basically was a simply rendered town (think: N64 graphics but way worse, slower, simpler) with a bunch of things happening. you could free-walk or attach yourself to any of the objects.

it ran at a few frames per second. i wasn't too unhappy with that, because although the machine had been around a $30,000 configuration when it was new in 1997 or so, mine was a fairly modest config. i couldn't afford the bigger graphics card with texture RAM or geometry engines, so I was running what's best described as "4-meg framebuffer suitable for coding".

at some point it occurred to me to pipe the demo to my mac via ssh -X (capital X; lowercase -x actually disables forwarding) or the equivalent rlogin-era incantation.

oops!

it ran at like a hundred frames per second on my laptop. over 10/100 ethernet. via remote X.
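a minimal sketch of that setup, for anyone who never did the remote-X dance (the hostname and the demo invocation here are placeholders from memory, not gospel):

```shell
# from the mac (which runs the local X server, e.g. Apple's X11.app).
# note the capital -X: lowercase -x *disables* X11 forwarding.
ssh -X me@octane.local          # placeholder user/hostname

# now, on the octane, any X program draws on the mac's screen,
# because DISPLAY points back through the ssh tunnel:
echo "$DISPLAY"                 # something like localhost:10.0
perfly town.pfb                 # perfly is Performer's viewer; the
                                # town-demo filename is a guess
```

the punchline is that X is a network protocol: the "client" (the demo, running on the Octane) just sends drawing requests, and the "server" (the mac) does the actual rendering -- which is why the laptop's graphics hardware, not the Octane's, set the framerate.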

ultimately (in like 2002-04 or so, so slightly before i had the Octane) SGI went to Radeons in the last generation of its Onyx "graphical supercomputer" systems, but by 2003 SGI was working on Itanium-based systems before ultimately going under because: why spend a hundred thousand dollars on an SGI Onyx that has the same Radeon you can put in a Pentium 4 or a Mac G4. at that point, most of their "graphical supercomputing" customers likely weren't really using capabilities that a big Onyx or Origin had that a desktop computer with an AGP slot didn't, but for the "scale-up HPC" customers, they were building an Itanium system running Linux.

secondarily to all that: almost everything about really using the Octane kind of sucked. in retrospect, i have absolutely no belief that getting a faster SGI would've made most of those things better.

sure, if you get, say, a tiptop SGI O2+ and spend a bunch on a really fast disk for it, you can capture video better -- but you can already do that with an iMac you could've bought for $1,299. (the O2+ even postdates that iMac and still cost a minimum of $10,000 in "reasonable video capture" configuration, so...)

IRIX fares better than almost everything else in all of these posts because, unlike, say, HP-UX, it actually had a fair amount of what you'd argue are "recognizable" commercial software titles. (i mean, Internet Explorer 5 launched for HP-UX and not IRIX, but i namedropped some Adobe software, Maya was on IRIX, Shake (which Apple later bought) was on IRIX, several Discreet and Avid tools were on IRIX, Mathematica was on IRIX, so-on and so-forth.)

that's all above and beyond what SGI bundled with the OS, which, to their credit, was actually a lot! SGI bundled a graphical composition tool (AMAZING oral history here: Making an “SGI-Quality” Presentation | The Real McCrea, with some more notes about this tool), a video editor, netscape, and a couple other things.

it's credible to imagine you could have daily driven one of these things as an individual at home if your work allowed it and/or if you were doing it explicitly to be on the cutting edge.

but i don't think it would have been good.

in a lot of ways, i think DIGITAL (and later Compaq) had it right trying to get NT ported to their platform.

but ultimately: a computer is its applications. outside of computers that were sold as graphics appliances like the SGIs, most RISC UNIX machines were sold as money-counters (IBM/hewlett DECPAQard), devops machines, infrastructure appliances (some of dec/compaq, sun), or platforms that exist nearly exclusively to run some type of bespoke code. not just "accountingsoft v1.6" but your organization's very own home-grown tools.

or you find them as part of appliances representing bigger investments, or as a cheap, easy way to computerize. e.g. a desktop AlphaServer and a pile of terminals is a credible card catalog for a regional public library. HP and IBM sold several of their "business" systems on the idea that you put a midtower-sized box under the desk of your admin assistant next to the printer they manage, and then everyone who needs to use the thing has a terminal.

ultimately i'm not sure that any of these vendors wanted normal people using them. i don't think they cared about any presumptive home computer market, or even about the general office computer market. UNIX vendors that did (at least until Apple got Mac OS X certified -- on Intel only) were a minority.

as a university student in the late 1990s or early 2000s you could buy yourself a RISC UNIX workstation for like five thousand bucks. (OSnews even had a review from someone who did, that may even have been Thom Holwerda himself but I'd have to go find it.) you were supposed to use them for Computer Science studies. an English major or even an "information systems" major wasn't supposed to think about buying themselves a RISC UNIX box.

there's no point to this other than to say that although it took a while, i was ultimately radicalized to believe that the interest in "big iron" of any type is almost always exclusively couched in the physicality of the machine. i wanted desperately to one day upgrade to an O2+ or even the vaunted Tezro because they were sexy and unique, not because they were going to be reasonable or even remotely "good" computers for anything i needed or wanted.

(there's a Post somewhere about how NeXT actually did try to build a UNIX workstation they thought "normal" people would want, the UNIX Computer For An English Major, as it were, but this is already too long....)

one of my dear friends actually made extensive use of SGI MIPS IRIX machines basically right up until "support" for IRIX 6.5.30 ended in 2011 or 2012. She compiled software, put a SAS card in a 2u Origin server, hosted her web site on it, ran an IRC client on one. as the web advanced, Firefox 2 became progressively less reasonable, but she was there until basically the official end. (hilariously she used her powermac g4 basically until the end of security for OS X 10.5 as well.)

i've been meaning to write about this separately but this is ultimately most of what saved the Mac: it had software for a wide variety of users on it. OS X and picking up some of the pieces of the dying RISC UNIX Workstation market helped a little bit, but ultimately "having MS Word, QuarkXPress, Photoshop, as well as Number Munchers" took the platform really far. if Apple had died, Macs would have had an exceedingly long tail of usage, and it would've been significantly more deserved than what some other platforms got.

.... anyway.

i would like a RISC unix box again (he wrote purposefully ignoring the digital alphastation and sun sparcstation on the shelf) but even in a perfect environment i simply have absolutely no idea what i would do with it that i couldn't do in a debian or ubuntu VM on my server, or on an old or new mac, or even if it came down to it, installing linux on a physical computer.



in reply to @cr1901's post:

I read this article too and had a similar reaction. I play a lot of old games, but I hardly own any vintage hardware, greatly preferring modern software emulation solutions and retro remakes and collections.

The author really lost me at trying to install Pro/E, especially as someone who doesn't have the design background. I did my Sr. design work on what were then brand-new Sun workstations in my college's engineering lab and started my career on Pro/E and SolidWorks, and do not have any nostalgia or affection for turn-of-the-century CAD programs. If you want to poke around with a CAD program, Fusion360 literally runs in a browser and you can hit CTRL+P to print or send something to a 3D printer instead of trying to find another set of drivers for a printer that won't work.

I think that's the difference for me. Retrogaming - or even retro productivity software - in my mind is distinct from vintage hardware as a hobby.

I was a newly minted engineer at PTC when it discontinued the HP-UX version of Pro/E. It was one of the last Unices still supported by that software, if not the last one before it became Windows NT only.

The engineers supporting that version were relieved to see it, or any of the other Unix versions for that matter, shut down.

Supporting all those different systems was a lot of work, and from what I was told (I was working on PTC’s Windows software), vendor support, at least in those last days of commercial Unix, didn’t do enough to alleviate the pain.

in reply to @selectric's post:

I had the opportunity to own a VAX in 2011. I didn't take it because 2011-me was a complete and utter moron who was only interested in DOS machines. I.e. I didn't know better.

If I get a VAX, chances are, I'll run NetBSD on it because I'm more interested in trying to get old silicon to do new things. But I fully sympathize with those who don't share my interest. I'm angry for those enthusiasts who have to go to great lengths to use machines and software that vendors have left to rot. Through no fault of their own, those retrocomputing enthusiasts are left with machines that can only use a fraction of their true power.

Also, to be perfectly fair, even in my "use NetBSD" approach, it would take effort to utilize the full power of these old workstations via reimplementations, emulation layers, or brand new software. But the FOSS kernels and FOSS userland are a starting point.

in reply to @cathoderaydude's post:

ah shit, I remember the Server Room of the Living Computer Museum. It made me cry a little seeing a bunch of old computers still doing stuff and running and basically singing to themselves. hell it's making me tear up now just thinking about it. the museum's closure is an incalculable loss

yeah, that's basically exactly where i was going. annoyingly, i find myself more fascinated by/interested in the mainframes, the big stupid Systems with a capital S, more than i do the stuff that's self-contained.

probably why i'm so fond of the connections museum, tbh: you can see the tangible results of what the Big Damn Computer's doing.

I don't have a good response to the good points you bring up. I guess my initial post is based on the fact that: "Yes, I get angry too if I have the technical know how to fix something, but can't due to factors beyond my control (which, yes, is usually not enough people caring)."

This also excludes preserving mainframes and large systems b/c that is not usually a single-person job. And it always feels like there's not enough interested people to collectively focus their bandwidth into preserving one of these old unloved computers.

I wonder if there's lessons to learn from people building the PRR 5550? Yes, a group of enthusiasts are building a f***ing steam locomotive in 202x.

I think part of the problem there is, preserving -- or building from scratch -- a steam locomotive is a whole different animal from getting big proprietary computer systems up again.

A steam locomotive can't really be DRM'd. (Though I'm sure they would've, if they could!)

You can pull it to pieces, measure things, make new ones with relatively simple operations (even if they are on a large scale), and there's not really anyone who can tell you not to by way of legal impediment...

Metal is metal, that bit's relatively easy! When you have software involved, where it was locked up tight with proprietary bullshit, and may never have even been available on physical media if it was fetched remotely... where do you even start unravelling that ball of tar?

It's a grim problem.

I used to be enamoured with SGI machines. Used to be.

A busted one passed through my hands and I kick myself for not keeping hold of it (an Indigo 2000, in teal); and I have an Octane 2 just ... collecting dust.

Because, really, even if you go through all the trouble of finding a monitor that'll talk to the weird-ass display output that you have to make a lead for, hunt down the IRIX CDs, and all the various software packages, and set yourself up with stuff you can actually run...

...you've got a unix box with CPU and graphics that were astounding at the time, and you would've paid $40 000 (or more!) for at the time, but now get walked all over by a $1000 or so x86 box.

And, yeah, the x86 monoculture is bad, like all monocultures are; and it's nice to look at how The Other Half lived with their immensely heavy & stout chassis, easily swappable modular machines (yank the PSU with two screws and a built-in handle, straight out the back, without opening the case; ditto the graphics card, etc.) but at the end of the day...

...

...you just have a big, heavy, noisy, power-hungry, slow, inconvenient, unix box with a bunch of boxed software you can't even run any more because the licensing stuff is all dead. And that's cool, if that's your thing; but once you've exhausted the things you can do with it that were unique back in its day, that's kinda it.

(And that Octane 2 is now refusing to netboot even the bsd RD that I put on it last time, so something's fucky, and I'm not really into Computer Touching™ as a hobby any more, so it's gonna stay like that 'til someone else takes it, I guess.)
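(for reference if anyone else takes a crack at one of these: SGI PROMs netboot over classic BOOTP + TFTP, so a dnsmasq box on the same segment can serve the ramdisk. every address, MAC, and filename below is an illustrative placeholder, not something from this thread, and it's a sketch rather than a known-good recipe:

```
# minimal dnsmasq config sketch for netbooting an SGI PROM
# (all addresses, MACs, and filenames here are placeholders)
port=0                                   # disable DNS; BOOTP/DHCP + TFTP only
enable-tftp
tftp-root=/srv/tftp
dhcp-range=192.168.1.50,192.168.1.60,12h
# the PROM speaks old-school BOOTP, so pin the machine to a fixed address
dhcp-host=08:00:69:aa:bb:cc,192.168.1.51
dhcp-boot=bsd.rd                         # kernel/ramdisk image under tftp-root
```

from the PROM monitor it's then something like boot -f bootp():bsd.rd -- and older PROM TFTP implementations are notoriously picky about the server's source-port range, so expect some fiddling.)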

They do look kinda cool, tho. And part of me is sad that the history, and the software, and the knowledge around them is blowing away in the digital wind; especially as the folks who used to work with 'em Recover...

Also pouring one out for The Living Computer Museum. I feel blessed for the two chances I've had to visit it and deeply saddened I likely won't be able to show it to friends that I know would love it.

I honestly believe our best bet at practically preserving really obscure hardware is going to be FPGA-based solutions and other types of modern hardware recreations. As long as we have some reference hardware and documentation to learn from we can hopefully do our best to recreate it in form factors that are more compact and easier to maintain. Of course it won't be the same as actually physically handling, say, a punch-card-reading mainframe. And this says nothing for preserving the software that they actually ran. But people can still learn from it and I think that is at least valuable.

One of the reasons I love retro computing is because older machines felt less like computers and more like machines with impossibly small levers and dials. You can't touch them with your hands but you can write commands to make little electrons push them around for you. I think that makes them an invaluable tool for learning computer science and having a better understanding of what your 32-core x86_64 beast is actually doing under-the-hood at the end of the day. I find that useful even as I write high-level C# code.

It's not really about the invoices that get printed so much as the micro (or not-so-micro) city inside the metal box that was built to do it. About how its wild electronic infrastructure works. What old engineering can tell us about the problems people were trying to solve, what path created modern machines, if anything can be gained from old ideas that may have been ahead of their time, etc etc

I think it's worth the trouble whenever a good opportunity arises.

i think a lot about the way that america, when it does do historic preservation at all, usually does "preserve in place" type stuff; an old building or other piece of significant infrastructure sits where it's sat since the 1800s never to be moved again, even if it's a genuine inconvenience to have it there (and not just in the "inconvenience to developers" way, something that causes problems for people rather than ghouls)

and then i think about like, the ise grand shrine in japan which is rebuilt every 20 years. stuff like that, where the spirit of the thing is more important than the specific physical instance of it that's in front of you

tying this back to retrocomputing, i can't help but think of the weatherstar units that the weather channel used to use for displaying forecasts and such. they were big physical rack-mount units that would be kept at cable headends and hooked up to all sorts of proprietary data via analog satellite feeds and the like, and there's been impressive work to restore the 4000, the one that's most become a brick in the modern day, to functionality. the software was downloaded on boot from a satellite feed! it's been gone for at least 8 years, since the last ones were taken offline and replaced with intellistar 2 junior units that use digital data sources, back in 2014. there's guys who have rebuilt the software from scratch, making something that can run on those ancient, ailing boards and almost perfectly imitates the original stuff. it's incredible. it's a massive achievement. but it's also not gonna last forever, since those last surviving machines will die eventually, and in the end it's not something that can meaningfully be emulated as it existed. it's gone! it's all gone, signal lost in noise.

so the community that's interested in the weather channel and the 4000 specifically created simulators, rather than emulators! it's something that imitates, as closely as possible, the observed behavior of the original device, but can be run in a window or fullscreen on your computer. it's one of the most fascinating pieces of software i've seen because i have rarely seen "simulation" done in a useful way. plenty of simple "X in CSS/JS/HTML" type deals, but never something that can actually perform the functions of the original thing without emulating.

i guess what i'm saying is that i think like, keeping a physical version of a device is important, and emulating them as they were, needing arcane configuration, is also a fascinating alternative to that. but i think simulations of "how would this have worked with that software, if configured properly, at the time?" can be quite interesting and potentially historically useful in their own way. i know that doesn't make the old hardware any easier to work with, but...

I didn’t realize the LCM closed, RIP. When the Roguelike Celebration conference was last live in 2019, we brought in a bunch of VTs and had them run the original Rogue (and Hack, iirc) by connecting to the LCM’s PDP-11. It was absolutely a highlight of the event.

in addition to all the above things mentioned, there's also the lesser-known fact that, when Rackable Systems bought SGI, they destroyed everything in the warehouse that held all of SGI's documentation, code, etc. - from what i've heard this was entirely to keep them off the hook from having to provide support for anything that SGI had made before, so they could keep just the brand name and no other obligations.

which really, really sucks for those of us who care about SGI and were trying to figure out if it was possible to get the current owner to release code and docs under an open license. now, even if we got them to transfer the rights for IRIX and the systems that ran it to us, the only option left is reverse engineering.

my deep passion is preserving/collecting electronic organ consoles. i can't get into pipe organs because of just how much space you need to store one. i'm already getting a little over my head in terms of storage of electronic consoles. you also run into the "it kind of sucks" problem when you've restored some of these consoles which weren't particularly remarkable except for maybe the method of generating sound, or that very few were made. some of it can be turned into business if you get good at restoring/repairing Hammonds, which are still in demand, but now the digital clones are getting really good, and even a purist like myself is like yeah, I'll take the 30 lb keyboard instead of the 300 lb one, my back hurts and gigs barely pay $100.

And similarly to these other fields, you can find an awful lot of people who will take a hammond organ off your hands and give it a good home, at least for a while. They might not be able to fix it, or use it, or show it to anyone but they'll at least go, damn a hammond organ, I guess I can make room for that.

Well, what about the crappy consumer ones? What will become of those? Some really are unimpressive, but there are some very cheap and cheap looking ones out there that make really cool sounds, stuff that hammonds can't, but the overwhelming majority of people are going to look at them and see trash. No brand recognition, no cultural cachet, often means that when the thing finally gets thrown out by whoever currently owns it, there's a good chance they won't even bother putting it on craigslist, just unceremoniously throw it in the trash. Or if they do list it, nobody's looking for that name. Or it's in the middle of Kansas, these things can't be shipped, and there's just nobody around for 200 mi looking for that thing by name.

And obviously we can't preserve every scrap of everything that has ever been made, but, it's always sobering to think about how rudimentary the selection metrics are. What people choose to preserve is decided by factors very different from their merits, to put it simply

absolutely - i am at the point where i'm trying to semi-permanently loan consoles out to friends. i haven't let myself search "organ" on craigslist in a long time because i'll just find more cool stuff that i can't bear to see scrapped. i'm now known as "the organ nerd" so consoles often find me. my arbitrary preservation parameters are often trying to find the flagship model of a console line; if it's not a Hammond i want it to have two full-size 61-note manuals and at least 25 pedals. i have made some pretty silly road trips from the PNW to save some of these (which is admittedly part of the fun) but then i end up with this giant console and even if i track down a service manual it's like... something specialized for church or theatre organ music, which i barely even play (i have a performance degree in jazz piano). but i feel compelled to do this and feel i am preserving some kind of beauty with these things

edit: they CAN be shipped, sort of. had to miss out on one i've wanted for years because it's in Pennsylvania and i'm in Oregon and the seller couldn't make things line up and found a local buyer in the interim.


in reply to @coryw's post:

yes hello pls gib post on NeXT's UNIX for an English Major, I'm so here for that

and yeah I really do wonder what would have happened if Apple went bankrupt, say if Microsoft didn't invest, or if they went with Be instead of NeXT and it went badly... would there still be G3s running around because people refused to move to Windows? User groups trying to keep this zombie platform alive?

There's still a community of people keeping PowerPC Macs alive. Until recently there was a Firefox port (someone may have even brought it back from the dead?) and also in the last couple years an enhanced PPC version of Snow Leopard (Sorbet Leopard) came out!

Sure I own one lol, I have an iMac G4. I don’t run OS X on it though, I have it because it’s one of the last models that can run OS 9 natively, which is more what I was implying above. What if Apple failed before OS X released? Or even shortly after it did, when it was still considered slow and bloated?

I would like to try Sorbet Leopard at some point. There was a thread about it on the forum a few months (?) ago and at the time it... sounded like it didn't really deliver on any of its promises. Rather than being Snow Leopard ported back (there is a PPC build of Snow Leopard but it's a super early beta), it's some customizations on someone's own Leopard 10.5.8 install, allegedly aimed at improving efficiency, but uh...

The recommendations for what runs Sorbet Leopard "well" are significantly higher than Apple's own recommendations for retail Leopard and unfortunately "boots Mac OS 9" and "runs Leopard well" were already completely mutually exclusive.

TIL that my iMac G4 is less than 100MHz below the recommended specs for Leopard haha. Still kinda want to try pushing it sometime, but I'm not in a hurry. I'm still trying to get the original discs for that computer, I got it wiped and haven't been happy with the kinda janky OS 9 Lives install I put on it.