starting to realize the reason I've come to loathe modern computing is the constant sword-of-damocles-esque feeling that any solution you've figured out will be violently subjected to bitrot no matter how hard you try to prevent it, and is as temporary as temporary gets: you only have six months, a year, maybe two, before that solution either isn't viable or no longer works at all.
I am old enough to remember when computing was functionally immutable: your computer and the software running on it changed only with your direct input, possibly staying the same for years at a stretch. new computers and operating systems took weeks to re-adapt to; there was none of this 'boiled frog' imperceptible change constantly happening over time.
and going further back, only a generation or so ago, this extended to Pretty Much Everything. my parents spent the great majority of their lives interfacing with, and experiencing, the same things and people every day, for decades.
real neat that the treadmill of obsolescence means i can theoretically work on the same project forever without ever improving it. there will always be dependencies to update and disruptive technologies that will make my perfectly functional web app unsustainably outdated. at no point will anybody in my company say "we did it, our app is feature complete". an entire industry dedicated to keeping the lights on.
worse, reaching feature completeness, if you somehow manage it, is seen by management-and-ownership-at-large as a failure state.
