• they/them

plural system in Seattle, WA (b. 1974)
lots of fictives from lots of media, some horses, some dragons, I dunno. the Pnictogen Wing is poorly mapped.

host: Mx. Kris Dreemurr (they/them)

chief messenger and usual front: Mx. Chara or Χαρά (they/them)

other members:
Mx. Frisk, historian (they/them)
Monophylos Fortikos, unicorn (he/him)
Kel the Purple, smol derg (xe/xem)
Pim the Dragon, Kel's sister (she/her)

posts from @pnictogen-wing tagged #computers

Who here has read John Kennedy Toole's A Confederacy of Dunces?

It was one of my favorite books in the 1990s and I'm sure I'll love it
just as much when I re-read it (eventually) because I regarded it as a
moral warning, a milepost of sorts: Don't Be Like Ignatius J. Reilly. C.
S. Lewis talked about his moments of Joy or Sehnsucht in
Surprised by Joy and I agree with him fully; such moments are
important—and Jack Lewis should have asked himself why he
stopped having them, even though he wasn't anywhere near Heaven
yet. But I've come to realize that there's a logical converse to such
moments: the times when you realize you've strayed too close to the Pit
and maybe you should back away. A Confederacy of Dunces was like
that. Reilly was too familiar for comfort. He was stagnant, soured,
morally and intellectually rotting in place, and as it turns out he also
predicted the future. The Internet is overflowing with Ignatius Reillys
and most of them call themselves "dark intellectuals" or something
similar. At some point in their pasts, as with Reilly, they decided
never to grow up: they chose some moment of dark epiphany to fixate
upon, some moment when they realized they were the only sane person in
an insane world, and they haven't budged a millimeter from that spot
ever since. I remember reading A Confederacy of Dunces in the
mid-1990s and thinking, oh gawd, let us make more use of college
education than THAT.

The "dark intellectual" people and the antisocial techbros who eat up
their stuff love to talk about their "redpill" moments, when they
supposedly realized that feminists had ruined the world or whatnot. Bret
Weinstein, who's peddled TERF diatribe and Sinophobic "theories" about
COVID-19 and is now claiming to be Saving the Republic™ on a
speaking tour with a bunch of other propagandists, has a particularly
hilarious such moment: when he was fired from a teaching job at
Evergreen State College here in Washington State for being too bigoted,
he declared this was evidence that Evergreen was the secret headquarters
of a vast leftist conspiracy to corrupt all education or something like
that. (He's blithered about this at length and you can learn all about
it on YouTube if you like.) As it happened, Ignatius J. Reilly had a
similar moment: he bused to Baton Rouge to apply for a teaching job at
Louisiana State University, flubbed the interview, and then decided that
this experience was a trip into the Heart of Darkness of modernity.
Reilly would tell this story of dark awakening to all and sundry, and
write extensively about it in foolscap tablets in his bedroom at his
mom's house. Now, though, you can put that stuff on the Internet, and
get paid for putting it there.

If there's any ONE event that gets the "dark Enlightenment" people
worked up, though, it's the endless September, the day when the
Internet was finally too public and commercial a thing to remain the
exclusive domain of universities and .mil accounts and that sort of
thing. There was a long enough interval when the nascent Internet was
the exclusive playground of college students and military contractors
for a pecking order to develop between wise professional greybeards and
clueless college freshmen joining the party late (like I did) and thus
contributing to a September rush of "dumb" and "moronic" newbies on
mailing lists and Usenet. But once enough people were getting Internet
accounts through corporate outfits like AOL, round the clock instead of
clustered round the school schedule, the result was an "endless
September" of newbies at all times of year. It's quite clear that
there's a lot of rancid resentful nerds who still think of this as the
End of the World, more or less, the day that the barbarians arrived at
the gates. After all, nobody represents civilization better than a
racist computer nerd still waging Mac v. PC wars.

I'd love to kill this bit of toxic nostalgia stone dead, if I could.
I've experienced a bizarre reversed version of it: I came to hate
computer nerd culture so much that I aggressively took the part of the
unsophisticated user, partly because one of my best friends IRL is a
very old-fashioned gardener born in 1951 who NEVER got used to this
stuff even a bit and still prefers to talk on the telephone. I've helped
him out with computer stuff and shared his anger: why is this stuff so
confoundedly hostile and overcomplicated? It's not fair to make someone
like my friend deal with a labyrinth of bad choices like the modern-day
website or recent Windows versions, much less the fucking smart phone.
(He refuses to get one. Can you blame him?) "Endless September" now
seems merely like the reification of the casual bigotry of toxic
computer geeks, the ease with which they divide everyone up into the
[slurs] vs the high-IQ, more "evolved" human beings, hoi polloi
vs. hoi aristoi.

It's not like they even respect that era of computing anyway, not
really. Oh they still spout out sentimental glurge about it but in
reality they're happy to have left it behind. It's safely in the past
for them, like Napoleon or Julius Caesar, and therefore safe to
mythologize.

~Chara of Pnictogen



it's been recommended to me many times that I break myself of the habit of doing everything in the web browser, which of course is the pattern of usage that web browsers and web developers have been encouraging for decades now. folks have been pointing out for a long time now that the web browser, which seemed like something new and amazing back in the NCSA Mosaic days of early 199x—oh dear gods was that actually Marc Andreessen who did that, gross—has transmogrified into a bloated miscreation, a kind of half-assed virtual machine that for lots of personal computer users has become the only way they interact with anything, using web applications and assuming that "the cloud" will simply keep all their data for them.

do people NOT notice how telling the names of these things are? "the cloud". how permanent are clouds? do you trust information you see written in clouds? (sighs) anyway

despite decades of experience with personal computers I've never developed much genuine facility for them, thanks to the intensity of the visceral and irrational loathing I've developed for the entire industry. but loathing of such vehemence stems from feelings of betrayal: I despise modern computing because at one time I was naïve enough to put all my hopes into it. there was an interval of childhood where computing really did seem like magic (and also something I felt my father was cool for knowing something about, in his older-fashioned way) so watching that old magical promise shrivel up under corporate misrule made me feel like I'd been tricked, led astray. by 1995 or so I could legitimately feel like computers had ruined my life because of how much time I wasted on them during my failed Caltech undergrad. but even then the magic hadn't completely gone out from them and I could still hope that maybe there was a future for me in learning to program computers and make money in software.

then I moved to Seattle in late 1999 to pursue that dream, and by late 2001 I was out of the industry altogether, for good. yay me

anyway thanks to this unpleasant set of experiences I've utterly failed to develop the kind of easy relationship and swift workflow that computer geeks experience on their machines. my computing habits have been toxic ones. I've alternated between spells of manic hyperfocus and overactivity on computers (probably coming from various introjects hidden deep within the Pnictogen Wing, seizing control for some specific activity) and intervals of loathing and avoiding computers altogether, seeking the solace of friendlier tasks like reading or watching movies or cooking. and in general I've stuck to the lowest-resistance methods of using the personal computer, i.e. I've behaved like an "end user", an unsophisticated consumer of computing using a bare minimum of mass-market applications. so, like any hausfrau or clerk or schoolkid who uses computers mostly because it's expected of them, I've been limiting myself to common web applications and using them in the expected way. open a browser, go to the website, type away.

that's a poor idea in practice because one of the most reliable traits of web applications is unreliability from multiple directions. even the best designed website can still be defeated by a browser crash or an Internet outage, after all, but more to the point: it's difficult for a web application to deal with interruptions properly. a native application can save state and recover easily from a crash, but a web application can't easily do that, so most don't bother. if the website suddenly bombs while you're in the middle of typing deathless prose, just like I'm doing right this moment, welp that's your fault isn't it? you should have been more careful! and anyway you should be grateful you get to do anything at all on a computer, you [slur], I bet your IQ is [get bent]. if you want something better program it yourself, etc.
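(the maddening part is how little effort "save state" would actually take. here's a minimal sketch of the draft-autosave idea that most web text boxes skip: stash the text somewhere persistent as you type, recover it after a crash. in a real browser the store would be `localStorage`; the `DraftStore` interface and the in-memory stand-in below are my own hypothetical names, just so the sketch runs anywhere.)

```typescript
// Minimal sketch of draft autosave/recovery, the "save state" trick
// native apps do routinely and most web apps skip. In a browser,
// `localStorage` already satisfies this interface; the in-memory
// version below is a stand-in so the sketch runs outside a browser.
interface DraftStore {
  setItem(key: string, value: string): void;
  getItem(key: string): string | null;
}

// Save the draft along with a timestamp, so stale drafts can be aged out.
function saveDraft(store: DraftStore, key: string, text: string): void {
  store.setItem(key, JSON.stringify({ text, savedAt: Date.now() }));
}

// After a crash or reload, pull the draft back out (null if none saved).
function recoverDraft(store: DraftStore, key: string): string | null {
  const raw = store.getItem(key);
  if (raw === null) return null;
  return (JSON.parse(raw) as { text: string }).text;
}

// In-memory stand-in for localStorage.
const memory = new Map<string, string>();
const store: DraftStore = {
  setItem: (k, v) => { memory.set(k, v); },
  getItem: (k) => memory.get(k) ?? null,
};

saveDraft(store, "post-draft", "deathless prose, safely stashed");
console.log(recoverDraft(store, "post-draft")); // prints: deathless prose, safely stashed
```

(in a real page you'd call `saveDraft` from the text box's input handler, maybe throttled to once a second, and check for a leftover draft on page load. that's the whole trick. it's not hard; it's just rarely bothered with.)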

I trust I've made my point. making software labyrinthine and unreliable has become almost a point of pride with toxic computer geeks, evidence of "intelligence" and a way to screen out the "dumb" people. if you're a "power user", i.e. someone willing to pour a ludicrous amount of wasted time into ferreting out and reverse-engineering the hidden secrets of software which shitty programmers like to put into their shit, then you've got something to brag about. what's more, the programmers themselves are largely insulated from the consequences of janky and unreliable computing equipment and software. they have the money for the highest quality toys and generous amounts of free time to get everything working to their satisfaction. the ordinary user who wants simply to use a tool rather than turn a ten-minute task into a weekend project in recompiling their Linux kernel gets no respect. hence we're forced to muddle along with semi-functional software.

you'd think I'd have learned my lesson with web applications then and done what (say) my metamour Gravislizard habitually does, which is write all their posts in a text editor first. but I have yet to develop such a habit. even text editors don't seem fun any more, or pleasant to use.

~Chara of Pnictogen



There's an idea that I've been trying to piece together slowly. I've been trying to figure out what it is, exactly, that's so bamboozling and confusing about being on a personal computer. Or a smart phone, or tablet or whatever.

For a while I thought: oh, it's the unnatural light. No matter what screen technology you're using (with the exception of e-paper, whose physical nature makes it uniquely pleasant on the eyes) the light that comes from a computer monitor or similar screen isn't like the light you'd get from usual Earth objects. More and more of the light sources in widespread use in human society are "unnatural" in this sense. The mammalian eye is used to continuous sources and continuous spectra and colors that aren't too saturated. Technology is required for sources that have narrow emission bands, or which are intermittent or oscillating. So it's kinda weird to stare at a screen. Is that the only issue, though?

There's also the fact that objects on a monitor have a blurriness or jagginess that isn't usual for physical objects. Text on a screen is always a bit annoying to read, and I don't think anti-aliasing helps (rather the opposite, with me anyway). Physical objects have a sharpness of definition that's missing from texts and other objects on screen. I remember hating the widespread introduction of anti-aliasing into OS X and later releases of Windows; it felt a bit like I was being made to squint through a thin layer of vaseline spread over the screen.

But there's a more important piece of this idea I'm less clear on, because we're not good with the math and geometrical concepts necessary to understand the nature of the beast, so to speak. I'm referring in general to how the presentation of information on computer screens, in overlapping rectangles that behave in eccentric and counterintuitive ways, has created a bizarre sense of interdimensional space. One can, on a computer, slip into a realm that has some notion of depth and direction, as if one were stepping into a physical 3D space, but in fact it's a chaotic mess, a labyrinth of passageways that presents the superficial aspect of a simple screen—pixels in a plane.

It hadn't occurred to me before how pseudo-3D shooters also exploit the ability of the computer to display the appearance of paradoxical spaces. They look locally like ordinary hallways or whatever, but in fact they're self-intersecting and connected up with each other in strange non-Euclidean ways. You know, like in R'lyeh! Gamers have simply gotten used to navigating such paradoxical locations so long as they look superficially acceptable to the eye...and I'm not sure that's really a good thing.

I am reminded uncomfortably of the appearance of the Witches' Labyrinths in Madoka Magica, which have something of the appearance of proper three-dimensional spaces, with depth and direction, but which follow their own confusing rules and are dominated by flat images. I suggest that without knowing it, computer programmers have led users into a paradoxical space that is neither two-dimensional nor three-dimensional, a space where people can be given the illusion of progress and motion without actually going anywhere. And now a large fraction of the computer-using population is so used to this state of affairs that the ordinary world now seems wrong to them.

~Chara of Pnictogen