
yaodema
@yaodema

and why they might be better off as 6 bits

wait I thought a byte was always 8 bits?

nope! a byte is the smallest unit a machine addresses, historically sized to hold one "character". on modern systems this is pretty much always 8 bits, but for a good while the most common byte size was instead 6 bits. I'll get into why, and why we swapped, as this post goes on.
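to make the difference concrete, here's a quick sketch of how many distinct characters each byte width can represent (just arithmetic, nothing machine-specific):

```python
# each extra bit doubles the number of distinct codes a byte can hold
for bits in (6, 8):
    print(f"{bits}-bit byte: {2 ** bits} possible characters")
# prints:
# 6-bit byte: 64 possible characters
# 8-bit byte: 256 possible characters
```

64 codes is enough for one alphabet case, digits, and punctuation; 256 leaves room for both cases and then some, which is a big part of the story below.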

so we used to use 6 bit bytes?

yes, in the earliest binary computers, and in many later designs until around the mid 1970s, that was the norm. the reason we swapped is partly the fault of the American Standards Association, but also partly the fault of Johannes Gutenberg, and the fault of some monks in the late 700s choosing a new, faster way to write Latin.


queerinmech
@queerinmech

another funny thing about this is that many home computers in the 1980s did not actually use proper ASCII, despite it having been a standard for almost two decades

this is (partially) due to the Signetics 2513 "character generator" chip from the 1970s, used for text mode by many home computers, which only supported 64 characters - all uppercase!

the 2513 chip was used by early Apple computers
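with only 64 glyphs there's simply no room for a separate lowercase alphabet, so machines like these folded lowercase input onto uppercase before looking up the glyph. here's a hedged sketch of that idea - the folding and the 0x20-0x5F range are the usual scheme for 64-character sets, not the 2513's exact ROM layout:

```python
def to_6bit(ch: str) -> int:
    """Squeeze one printable character into a 6-bit glyph index (0..63)."""
    code = ord(ch.upper())           # fold a-z onto A-Z: no lowercase glyphs exist
    assert 0x20 <= code <= 0x5F      # space..underscore: the 64 printables covered
    return (code - 0x20) & 0x3F      # rebase to 0..63 so it fits in 6 bits

print([to_6bit(c) for c in "Hi!"])  # prints: [40, 41, 1]
```

note how "H" and "i" end up as adjacent codes 40 and 41 - once folded, the machine literally cannot tell them apart.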

the ZX80 and ZX81 likewise only supported uppercase, but this time due to implementing their own bespoke character set in ROM where every character took space away from the system software!

(also - most hardware used in the 1980s was based on chips from the 1970s, either repurposed directly or cheaply manufactured descendants thereof, so ASCII was still relatively new when those chips were designed)