

this post led me to softwarecrisis.dev, i guess it's mostly chapters from a book they wrote? but this passage specifically resonated with me:

The current trend is towards the aesthetics of correctness. Everything has to look like it has strong or static typing. It doesn’t have to really have static typing. That can all be made up after the fact in a declaration file. It merely needs to have the aesthetics of types. Type annotations everywhere, implementing logic through type system trickery, and forcing any and all dynamism out of the system in the name of correctness is the name of the game.
A part of this trend is the unpopularity of the approaches and languages that are seen as less rigorous. CSS is dropped in favour of statically typed CSS-in-JS approaches. HTML is dropped in favour of a strict inline XML-like markup format called JSX. Just a few years ago, everybody in web development hated and dropped XML and XHTML specifically because it was too strict and felt less dynamic and flexible than HTML. At some point, pop culture will bore of this and swing its attention back the other way.
It’s a fashion industry. Trends come; trends go. The lack of historical awareness is considered by most to be a feature.
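for a taste of what the excerpt calls "type system trickery", here's a small TypeScript sketch (the names are made up for illustration, not from the excerpt):

```typescript
// compile-time "logic in the type system": a template-literal type
// that only admits strings shaped like a particular route
type UserRoute = `/users/${number}`;

// a conditional type that pulls the element type back out of an array;
// all of this exists only at compile time and is erased before runtime
type Elem<T> = T extends (infer U)[] ? U : never;
type N = Elem<number[]>; // resolves to number

const ok: UserRoute = "/users/42"; // typechecks
// const bad: UserRoute = "/users/abc"; // would be a compile-time error

console.log(ok); // prints "/users/42"
```

none of this changes what the program does, which is kind of the point: it's correctness aesthetics layered over plain strings.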



in reply to @sudocurse's post:

It's even weirder when you zoom out, because the "orthodoxy" basically fragmented in the '90s with Perl, probably the first seriously-used language to assert that types don't need to matter, rather than apologizing for not having typing implemented yet.

From that point, you have languages that have retained typing and apologize when you cross a boundary into a world without solid types. But you also have the typeless languages that have realized that, oh, the warning that you could only find problems at runtime wasn't a metaphor for something else. So, they've tried to reinvent the idea in a tradition where that's objectionable, while also not wanting to break all the old code that brings people into the community.
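That "reinvent types without breaking the old code" move is basically gradual typing. A minimal TypeScript sketch of what the boundary looks like (the function names here are invented for illustration):

```typescript
// legacy code keeps working untyped via `any`, the escape hatch that
// lets old dynamic code coexist with checked code
function legacyParse(input: any): any {
  return JSON.parse(input);
}

// the typed world apologizes at the boundary: it narrows the untyped
// result with a runtime check before trusting it
interface User { name: string; }

function parseUser(json: string): User {
  const raw = legacyParse(json);
  if (typeof raw?.name !== "string") {
    throw new TypeError("not a User");
  }
  return { name: raw.name };
}

console.log(parseUser('{"name":"Ada"}').name); // prints "Ada"
```

The runtime check is the tell: the type system alone can't vouch for anything crossing in from the untyped side.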

There's a lesson in there that goes well beyond programming, but this comment has already rambled long enough...

I don't really feel XHTML offered anyone anything. If you were writing it by hand, it was just annoyingly verbose; if you were generating it programmatically, any time you forgot to mark inclusions as CDATA (or whatever it was) you'd get angry red messages about invalid markup.
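For anyone who missed that era, the CDATA thing looked roughly like this: inline scripts in XHTML had to be wrapped so the XML parser wouldn't treat characters like `<` and `&` as markup.

```html
<script type="text/javascript">
//<![CDATA[
  // without the CDATA wrapper, an XML parser reads "<" and "&&" below
  // as the start of markup and rejects the whole document as invalid
  if (width < 640 && height < 480) {
    document.title = "small screen";
  }
//]]>
</script>
```

Forget the wrapper in one generated page and the validator (or the browser, if you served real XML) refused the entire document.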

It did catch on, though; it was all the rage from 2000 through basically the whole decade, until HTML5 killed it. It was a reaction to the "IE is evil" era ushered in by millions of websites saying "best viewed in Internet Explorer 5 or greater" (now, after years of divesting and building alternatives, i guess we're just shifting back to the same thing, only it's "best viewed in Google Chrome"). If you look through alistapart.com and popular webdev listservs from that decade you can find a huge wave of enthusiasm for it: accessibility, design consistency, and a host of other reasons.