
xkeeper
@xkeeper

i got yelled at once for saying i should be able to disable https if i want in some cases. how dare i encroach on security!!!!!!

99% of the time you don't need shit to be secure. you won't care. but i find that it's increasingly like those public cctv and security cameras you see everywhere: it's suffocating, and critically, it also kills off every alternative or old client out there.

like, great, yeah. you require tls 1.3, and the latest certs to be installed. anything that's too old to get an update now? dead. there's nothing you can do. an end user is completely unable to do anything about these issues outside of buying a new device.
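to make it concrete, this is roughly the entire server-side decision, sketched in typescript against node's tls module (hypothetical cert paths, not any particular site's config):

    // a minimal sketch: a tls server that only speaks TLS 1.3.
    // one config line, and every client that maxes out at TLS 1.2 or
    // below fails the handshake before a single byte of content moves.
    import * as fs from "node:fs";
    import * as tls from "node:tls";

    const server = tls.createServer(
      {
        key: fs.readFileSync("server-key.pem"),   // hypothetical paths
        cert: fs.readFileSync("server-cert.pem"),
        minVersion: "TLSv1.3",
      },
      (socket) => {
        socket.end("HTTP/1.1 200 OK\r\n\r\nhello\r\n");
      }
    );

    server.listen(8443);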

you have something retro you want to get online? too bad. have fun setting up a bunch of weird proxies to get around things. you want to download old software? guess what: it's often also hosted on these sites with higher requirements. you can't even download k-meleon on older systems, a browser that exists specifically to bring newer encryption protocols to older shit, because... the download site requires the modern ciphers and refuses anything else.

i think the most damning thing of all is that, as much as Google is leading the charge in enforcing these, showing scary NOT SECURE!!!!!!! if you dare to use http for anything... their shit still works with it. i'm pretty sure if you dump google.com in windows 98 internet explorer, it will still dutifully load an old search page, that, critically, still works.

my take on it is just: do you need it? do you really need five layers of web security for every single operation you do? i'm not saying it all should go away; banks and other websites that take personal information shouldn't be insecure. but the vast majority of the web doesn't need this. your geocities-aesthetic page does not need the finest encryption the nsa can provide. 99.999999% of the time nobody is going to give a shit.

but the fact that it's on, with no option to ever turn it off, means that you have no option but to upgrade to the latest and greatest. if you have something old, it could still be fully working; but they swapped the locks on you, so you can't use it any more.

disclaimer

i can rant about these things precisely because i have no impact on them. nobody is going to read this and turn around to go "wow, we should turn https off entirely!" because i ranted about it some. if you show up and go "wow so you just want everyone to get MITMed and hacked forever, huh" i will kick you in the nuts or nuts-equivalent and push you down a flight of stairs.

if you do this you are showing up to the old guy with a waist-length beard holding a cardboard sign saying "OLD WAS BETTER" and trying to argue with them, and i will instead beat you with the sign. let me have my fun. you are never going to feel the impacts of my rants, because they don't exist. but you might feel what i'm ranting about.


internet-janitor
@internet-janitor

There's an especially insidious wrinkle to this in web APIs. Many newer web APIs can only be used for documents served via a "secure context", which means served via HTTPS. Even for completely static single-file applications that never so much as make a single network request. Despite the claims on that MDN page, some browsers don't even allow these features to be used on documents opened from a local filesystem, and you can expect this to get locked down tighter over time.
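To make the constraint concrete, here's a sketch of what a page can observe about its own context (browser TypeScript; the exact set of gated APIs varies by browser, and the cast is only there to satisfy the compiler):

    // isSecureContext is false on plain http (localhost is exempted),
    // and secure-context-gated APIs are simply absent, not merely blocked.
    if (!window.isSecureContext) {
      console.log(navigator.mediaDevices);             // webcam/mic: undefined
      console.log((window as any).showSaveFilePicker); // save-in-place: undefined
      console.log(crypto.subtle);                      // webcrypto: undefined
    }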

Web-Decker will probably never be able to prompt the user to take a webcam photo, or access gamepads on firefox, or save a file in-place, because in addition to quite reasonable affirmative-consent-gating dialog boxes, there is this bullshit HTTPS constraint.

Secure Contexts are a very deliberate choice to ratchet applications toward HTTPS, and HTTPS is in turn a ratchet to kill old software.


asg
@asg

I'm not here to argue that requiring HTTPS everywhere doesn't make things hard for old sites. It does.

But all this talk of "you don't need security all the time!" misses the point that HTTPS also provides privacy. Third parties can't see what pages you're visiting, or the contents of those pages.

Sure, maybe you're not worried about getting owned by someone MITMing your connection to a retroware site. But do you really not care about your data getting hoovered up by every single multinational megaconglomerate capital overlord too?


xkeeper
@xkeeper

https is fine. https has been around for over 20 years at this point.

but i'd like to make a few counterpoints:

  • the main thing is the relatively recent requirement to nuke old protocols and only enable the newest, most secure ones. this is the problem. old clients simply do not have a way to communicate. there is no fallback. there is no "okay, we can use this less secure setup".
  • "best practices" also means you can't use http, at all, even if you can't use https.
  • part of this is because "but then an attacker could just pretend they don't support anything newer" and like, sure, whatever, valid.

the "but do you really not care about your data getting hoovered up": every website on the fucking planet is doing this to you already. even this website has an analytic script at scripts.‌simpleanalyticscdn.‌com! no megaconclomerate is going to sit there decrypting your tls 1.0 traffic to oldshittydosgames.‌com.‌gz when it would be much faster to nab it where everyone is willingly logging their traffic.

let me rephrase my original point a little more:

forcing all clients and servers to offer only the latest ciphers, and to reject any client that can't speak them, effectively disables every client older than a few years. sure hope those programs are still getting updated.
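you can even watch this happen from a modern machine by pretending to be one of those old clients; a sketch in typescript/node, with a hypothetical hostname standing in for any 1.3-only server:

    // cap our side of the handshake at TLS 1.2 and try a host that
    // requires 1.3. the failure happens inside the handshake itself:
    // no http response, no error page, nothing for the client to render.
    import * as tls from "node:tls";

    const socket = tls.connect(
      { host: "tls13-only.example.com", port: 443, maxVersion: "TLSv1.2" },
      () => console.log("connected with", socket.getProtocol())
    );
    socket.on("error", (err) => console.error("handshake failed:", err.message));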

and, again: as much as google pushes it? their own flagship website doesn't do this. you can still go there with plain ol' http if you want.



in reply to @xkeeper's post:


i think part of the motivation for this is that lots of new web platform features are too dangerous to allow over regular http. so instead of leaving dangerous features out of the platform we just make sure everyone has to go https or go fuck themselves

Yeah, a friend once suggested I configure my own site to only allow HTTPS for security, which felt silly when I don't even expect it to get enough attention to be targeted by an attacker, and it would go against the whole point of the site, which was to be fully viewable under Internet Explorer 6

I still have HTTPS, of course, and anyone who's serious about security will set their browser to HTTPS-by-default anyway. But there's no point taking the option of unencrypted HTTP away from my site and breaking that backwards compatibility just to provide an entirely hypothetical level of security for the small handful of people who would choose to view it that way on a modern browser

I feel this in my bones. So many legacy devices in environments I've worked in have required me to maintain a laptop or vm or something with an ancient browser version, because modern browsers just flat out refuse to connect to them.

I am purposeful about allowing people to access my site via both http and https, and only forcing https wherever auth is concerned. Because there's a lot of shit that breaks when only https is allowed, such as transparent caching proxies (which are important in areas with poor bandwidth/access).

I used to have it such that my site would serve up entirely different sites on http vs. https, and when I worked at a search index company a few years ago I made a point to bring that up whenever anyone was assuming that http and https URLs are equivalent (our index didn't even track the scheme of URLs! it was maddening!) and like, yeah, it's an edge case, but edge cases are what make things break.

that’s an example of “where auth is concerned” though. like yeah, if there’s a login cookie, then you need https. if the cookie doesn’t involve auth, though, why do you care if it gets stolen?
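for what it's worth, that split maps directly onto the cookie's Secure attribute; a minimal sketch in typescript/node (hypothetical cookie name and value):

    // mark the session cookie Secure so browsers only ever send it over
    // https, while the rest of the site stays reachable over plain http.
    import * as http from "node:http";

    http
      .createServer((req, res) => {
        if (req.url === "/login") {
          res.setHeader(
            "Set-Cookie",
            "session=abc123; Secure; HttpOnly; SameSite=Lax; Path=/"
          );
        }
        res.end("ok\n");
      })
      .listen(8080);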

once I was using a 10-year-old laptop and the rtc battery died, leaving it at a date of 1/1/2000. My browser freaked out when I tried to access google because the certificate was far in the future. It already used https pinning, so I couldn't fall back to http. I had a lot of trouble searching for how to adjust the clock on linux that day.

what we have set up on ahti.space, sortix.org, etc is to redirect to HTTPS if your browser sends the Upgrade-Insecure-Requests header (sent by default on modern browsers) but still allow access over HTTP otherwise

I don't understand why that's not more common – it doesn't introduce any new attacks for modern browsers (if you are in a position to strip off the Upgrade-Insecure-Requests header, you are in a position to just proxy the entire request), it still automatically upgrades the connection to HTTPS for modern browsers that default to plaintext HTTP when you don't include the protocol, and it allows me to access the site on any random retrocomputer I have. I guess once TLS 1.4 starts becoming required we might need to rethink the setup to allow TLS 1.2 / 1.3-only systems that send the header to connect, but that is not a lot of maintenance burden at all
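For anyone who wants to replicate the idea, here's a minimal sketch in typescript/node (not the actual ahti.space/sortix.org configuration, which is presumably a web server rule rather than an app):

    // redirect to https only when the browser advertises
    // Upgrade-Insecure-Requests; serve plain http to everything else.
    import * as http from "node:http";

    http
      .createServer((req, res) => {
        if (req.headers["upgrade-insecure-requests"] === "1") {
          res.writeHead(307, {
            Location: `https://${req.headers.host}${req.url}`,
            // cache keyed on the header, so a shared cache never hands
            // the redirect to an old client that didn't send it
            Vary: "Upgrade-Insecure-Requests",
          });
          res.end();
          return;
        }
        res.end("served over plain http\n");
      })
      .listen(80);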

Google is shit.
I remember reading an article about problems caused by them lobbying to require https everywhere (around the time HTTP/2 / SPDY etc. were being pushed, and before i first heard of Let's Encrypt as a tool), especially in third world countries. Missionaries in schools on the far end of the globe, with Internet access only via satellite, had extra trouble getting materials (even something as basic as opening some wikipedia page) due to extra packets going through a high error-rate connection. They had some caching put in place so they could revisit previously opened pages without dealing with their satnet limitations, but it only worked for non-secure http.

And now i tried to look for that article again. Of course google's search focused on the fact that i want to learn about satnet, or missionaries, or socio-economic problems in Africa, completely ignoring that i put HTTPS as the first word in my damn query.
It effectively buried the thing under its "i know better what you'd want to read, buddy".

in reply to @internet-janitor's post:

I remember an article someone wrote about google's push to require https everywhere and how that would cut off huge swathes of the old pre-1998 internet that was still happily being hosted on the same technologies it always had been, and would now be inaccessible because instead of being basically free to host they'd need to pay however much each year for a cert that has to be renewed. and how this was part of their strategy to kill off the open internet.

I... look, I'm biased (see bio). But getting a cert from Let's Encrypt is free (as in beer and as in speech) and all of the clients automate it for you so you can set and forget. Could it go wrong and fail? Sure, but so could the ancient Apache webserver or whatever is actually serving that old site.

it's easy to say "could" go wrong, when i'm very much an example of how it goes wrong. if you aren't also constantly updating and maintaining that server, your method of automatically generating that certificate can and will be deprecated

this happened to tcrf; we had been running on debian 6 well past its end-of-life date, under the philosophy of "don't fix what isn't broken". then let's encrypt decided to stop supporting that os. and, great. everything else still worked, but the client simply stopped.

it wasn't that the os was fundamentally not usable, or that there was some other problem. i ended up scrambling to manually renew it and eventually set up an automated client written in bash to handle it instead

all of this because certificates for websites constantly need to be renewed

and even then, everyone is now dependent on a central authority to validate and generate their certificates. yeah, great, letsencrypt is free. but there wasn't always a free ssl provider, and it leaves a massive, centralized point of failure for uncountable domains

so, let's encrypt adds an every-three-months rotating time bomb to your server setup, a setup that otherwise runs on a protocol that has been around for literally two decades at this point, http/1.1
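(which is why everyone ends up cronning some little expiry watchdog; a sketch in typescript/node, hypothetical hostname:)

    // connect, read the peer certificate, report days until expiry.
    import * as tls from "node:tls";

    const host = "example.com";
    const socket = tls.connect({ host, port: 443, servername: host }, () => {
      const cert = socket.getPeerCertificate();
      const msLeft = new Date(cert.valid_to).getTime() - Date.now();
      console.log(`cert expires in ${(msLeft / 86_400_000).toFixed(1)} days`);
      socket.end();
    });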

it's fucking bullshit. there's no nice way to put it. we've fucked everything up, pointlessly.

rapid releases on everything until things just fucking vanish out from under you and then you're fucked because nothing's ever hardened anymore so you gotta migrate off whatever the fuck you had the misfortune to depend on

I'd legitimately like to know more about this.

For one, Let's Encrypt doesn't support operating systems; we just expose an API that has been standardized at the IETF since 2017. Individual Let's Encrypt clients run on local machines, and therefore have to support operating systems. Since you're not differentiating, I assume you were using Certbot?

I'd love to get more details about how the client simply stopped working. As far as I'm aware, you should be able to install an old version of any client and have it just keep working. The only exception I can think of is if your client was from before the protocol was standardized, and only supported the v1 pre-standardization draft protocol, which was finally deprecated and turned off in 2021 IIRC. (Nota bene: I do generally promote that people allow their clients to auto-update, so that they can get access to new features which are added to the protocol, but the base protocol should remain backwards-compatible in perpetuity now.)

Can you help me understand what the failure mode was? This isn't supposed to happen.

But mostly I'd like to dig into the assertion that the webserver running on Debian 6 was "still working". Was it really? That OS (and presumably the server software you were running in it) is from 2011. So by not updating, you were remaining vulnerable to Heartbleed, Shellshock, and many smaller attacks. Things that can turn your server into a crypto miner, or a DoS node. Does that seriously not concern you? Keeping things up-to-date is an important security practice regardless of the content of the site.

Yes, it sucks that we can't just set-and-forget things. Yes, it sucks that old websites are driven off the web because their authors incorrectly thought that they could. But it's not just SSL that's driving them off the web -- they need to stay updated for myriad reasons, and once a site maintainer realizes that updates are necessary, then SSL is free and easy.

yes, it was certbot (forgive me for conflating the service with the standard tool that the service itself pushes everyone to use), and i'm not exactly sure what broke, because certbot helpfully auto-updates itself as the first thing it does, and it decided to no longer work for whatever reason. i'm sorry i can't pull out the exact reason it decided it was done being useful, because that was something like 4 years ago at this point.

So by not updating, you were remaining vulnerable to Heartbleed, Shellshock, and many smaller attacks. Things that can turn your server into a crypto miner, or a DoS node.

for as easy and trivial as all of this is made to sound, we ran for a long, long, long time without any actual problems

and at the same time, i'm not sure how you can assert that all of these things must be problems, because of the sheer number of old sites that are clearly working without turning into paste. if it was so easy to go bing bong you have a crypto miner now because your shit's old and exploitable, i'm pretty sure we would see a lot more of it, especially around older sites that are surely running on even older hardware

but really i just find it absurd to walk up to someone who has had a system running well for years and immediately express doubt: are you sure your car is running well? it's 10 years old, surely something must have broken!

Totally fair, tons of people conflate Certbot and Let's Encrypt. I'm sorry that Certbot decided to stop working on your OS; I'm glad that you found (I'm guessing) acme.sh, which is designed for minimal and portable installs, exactly as you needed.

I think it's less like saying "are you sure your car is running well?" (although yes, those are the words I used, I apologize) and more like saying "are you sure you want to still actually drive that car, despite not having any airbags and never having been subjected to modern crash tests?". Like yes, it still runs, but the people who designed newer cars made them the way they are for good reason. And by continuing to use an old one you're not just living with fewer safeguards, you're actively putting yourself at greater risk, because your car will lose the fight in a head-on collision with a new car. Okay, maybe the metaphor has gone too far.

if I have a website hosted on a box in my closet that hasn't been touched or even rebooted since 1996, which requires no maintenance and costs me nothing, and I suddenly have to figure out how to get it working on the modern web, I'll likely just unplug the box
this is the case for many, many websites
even the Space Jam website moved to a new server in 2021 after 25 years of being untouched

in reply to @xkeeper's post:

someone disliking an aspect of something doesn't mean they think it's some sort of nefarious conspiracy.
do, do you think every little inconvenience you encounter in life is some sort of nefarious conspiracy? are you projecting an inability to view the world any other way onto others? like, what is going on here?

google literally put a big "NOT SECURE!!!" warning in their browser for every http page back in 2018, specifically to push disabling http entirely and force everything onto https. so yes, i'm not really sure why you are acting like this

they have also outright declared that certain tlds can flatly never, ever support non-https traffic, like their .dev tld, so again, yes, they are in fact pushing an https-only world while not adhering to it on their own damn site, yes

i mean, i guess, showing up and immediately pulling out the WoW iM vErY SeRiOuS lOoK aT Me tYpInG FuNnY means you're not interested in actually thinking about anything; you're just here to be an asshole, and quite frankly that's my job and i don't feel like competing