• he/him

Coder, pun perpetrator
Grumpiness elemental
Hyperbole abuser


Tools programmer
Writer-wannabe
Did translations once upon a time
I contain multitudes


(TurfsterNTE off Twitter)


Trans rights
Black lives matter


Be excellent to each other


UE4/5 Plugins on Itch
nte.itch.io/

gradients
@gradients

A few months ago @Halceon commented on a @lucasarts-places post in appreciation of how nice some of the gradients looked in those old VGA games. In a fugue of agreement I carefully isolated a few of them in a paint program, to examine more deeply what I admired about them. The process of doing that made me think: hm, I'll bet I could recreate those with code. And over the next few months I went down a fairly enjoyable rabbit hole of learning how to implement various color blends, gradient shapes, and pattern dithering, and digging up hundreds of interesting color palettes and combinations from old games as well as modern pixel art.

The result is this bot. It'll post twice a day, once at sunrise (PST), and again at sunset. To keep the output fresh, it draws on several different gradient generation, color picking, and rendering methods and pulls from a large internal library of color palettes and "ramps" (runs of progressive color blends) - I'll explain more of the technical details in a followup post.

If the colors and blends of old 8-bit and 16-bit games and paint programs are burned into your brain, you might recognize some of what comes out of this code. If not, simply enjoy the colors. Thanks for following!


gradients
@gradients

One thing I should say up front is that this bot's code was written for the sole purpose of generating images for this bot; i.e. it's not some cool new general library for generating and drawing gradients. It also doesn't contain any exciting new graphics techniques - this was me methodically working out how to do very old stuff like Bayer matrix dithering without leaning on any existing library.

Where to start? I knew early on that I wanted to support multiple "shapes" of gradient, i.e. not just the classic vertical or horizontal test swatch. These are handled by different GradientRenderer classes, which take a Gradient object as input and return a Python Imaging Library (PIL) Image. Generating the Gradient is where most of the action happens.
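To make the renderer/generator split concrete, here's a minimal sketch of the "shapes" idea. These function names and the pure-Python row-list output are my own illustration (the bot's actual renderers are classes that return a PIL Image), but the per-pixel logic - mapping each pixel to a normalized position and asking the gradient for a color there - is the same.

```python
# Illustrative sketch: a gradient is abstracted as a color_at(t) function
# mapping a normalized position t in [0, 1] to an (r, g, b) tuple. The
# real bot wraps this in GradientRenderer classes that build PIL Images.

def render_vertical(color_at, width, height):
    """Top-to-bottom gradient: t is the row's normalized position."""
    return [
        [color_at(y / max(height - 1, 1))] * width
        for y in range(height)
    ]

def render_radial(color_at, width, height):
    """Circular gradient: t is the pixel's normalized distance from center."""
    cx, cy = (width - 1) / 2, (height - 1) / 2
    max_d = (cx * cx + cy * cy) ** 0.5 or 1.0
    return [
        [
            color_at((((x - cx) ** 2 + (y - cy) ** 2) ** 0.5) / max_d)
            for x in range(width)
        ]
        for y in range(height)
    ]
```

Adding a new shape is then just a new mapping from pixel position to t, which is presumably why the shape/generation split pays off.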

The basic unit I settled on for decomposing a gradient was the Blend: a start color and an end color (RGB integer 3-tuples, ready for feeding to PIL), and a "size" that is some fraction of the gradient's normalized (0 to 1) length. A generated Gradient is a list of those Blends, along with the other information needed to render it: its shape, its native render size, and some post-processing settings, like what kind of dither (if any) it should use. Most of the gradient generation process is building that list of blends: deciding what colors they should use, and how they're sized and arranged.
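That decomposition might look roughly like this. The class and field names are my guesses from the description above (and the data dump in the replies), not the bot's actual code:

```python
from dataclasses import dataclass

@dataclass
class Blend:
    start: tuple   # RGB integer 3-tuple, ready for PIL
    end: tuple     # RGB integer 3-tuple
    size: float    # fraction of the gradient's normalized (0..1) length

@dataclass
class Gradient:
    blends: list                       # list of Blend, sizes summing to 1.0
    shape: str = "GS_VERTICAL"
    render_size: tuple = (256, 256)
    dither: str = "DT_NONE"

    def color_at(self, t):
        """Map a normalized position t in [0, 1] to an interpolated color."""
        pos = 0.0
        for blend in self.blends:
            # Fall through to the last blend to absorb float drift at t=1.0.
            if t <= pos + blend.size or blend is self.blends[-1]:
                local = (t - pos) / blend.size if blend.size else 0.0
                local = min(max(local, 0.0), 1.0)
                return tuple(
                    round(s + (e - s) * local)
                    for s, e in zip(blend.start, blend.end)
                )
            pos += blend.size
```

The linear RGB lerp here is the simplest possible color blend; the post mentions the bot supports "various color blends", which I'd expect to slot in where this interpolation happens.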

A ton of things in the bot are weighted random, where I've defined tables for how often a given choice should come up. I centralized everything related to random chances and weightings in tuning.py.

The dither method used has a huge impact on a gradient's overall feel. Like I said, I implemented pattern dither from scratch, partly because PIL deprecated it[1] and partly because I wanted to play in that aesthetic space, with different variations on it. Initially I couldn't wrap my head around the algorithm for generating an ideal NxN Bayer matrix, so I made the code source dither patterns from image files, and later wrote code to generate different patterns in search of the ideal one. I kept most of those experiments around, weighted to be fairly rare, as an occasional bit of spice, before finally settling on a recreation of the 8x8 Bayer pattern that Deluxe Paint II used. I also occasionally let PIL use one of its own dither methods, based on Floyd-Steinberg, just to add more variation.
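For reference, the textbook recursive construction of a Bayer matrix (the part I mention not initially wrapping my head around) is short once you see it - each doubling step tiles the previous matrix four times with different offsets. This is the standard algorithm, not the bot's image-sourced patterns or the Deluxe Paint II recreation:

```python
def bayer_matrix(n):
    """Build a 2^n x 2^n Bayer threshold matrix by recursive tiling.

    Each step replaces matrix M with [[4M, 4M+2], [4M+3, 4M+1]],
    which keeps thresholds maximally spread out spatially.
    """
    m = [[0]]
    for _ in range(n):
        size = len(m)
        new = [[0] * (2 * size) for _ in range(2 * size)]
        for y in range(size):
            for x in range(size):
                v = 4 * m[y][x]
                new[y][x] = v
                new[y][x + size] = v + 2
                new[y + size][x] = v + 3
                new[y + size][x + size] = v + 1
        m = new
    return m

def ordered_dither(value, x, y, matrix):
    """1-bit ordered dither: decide whether pixel (x, y) lights up.

    value is a 0..1 intensity; the tiled matrix supplies a per-pixel
    threshold, so flat intensity regions become regular patterns.
    """
    n = len(matrix)
    threshold = (matrix[y % n][x % n] + 0.5) / (n * n)
    return value > threshold
```

With more output colors you'd dither between adjacent colors of a blend rather than on/off, but the thresholding idea is the same.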

Color selection was obviously a huge topic, and I knew I'd want to support multiple methods side by side - again chosen by weighted random selection for each gradient - which eventually became the ColorPicker class and its children. "Pure Random RGB" was the obvious, trivial first method, but I knew I wanted part of this bot to be about "sampling" from the past: all the beautiful color palettes artists and hardware designers built decades ago (and continue to build today!). Adapting the approach I used with Playscii, these palettes are stored as image files[2], and a separate data file manually defines all the "ramps" in each image, which the blend generation process uses when choosing colors and deciding how to move between them.

Deciding which colors within a palette constituted which ramps was a subjective call on my part, and it became clear that this was a major determiner of how gradients generated from them turned out. I eventually realized that in a lot of cases a palette only had one or two interesting ramps and the rest of it might as well have been junk, so I created a separate color picker method called "authored ramps" that draws from a big collection I gradually accumulated from a wide variety of sources. Compared to palettes, which are kind of whole little worlds unto themselves, the ramp library needed to be a well-curated selection with diverse looks. So I added an option to my utility script for dumping out the ramps library that sorts them by average hue, letting me tell if I was neglecting any particular part of the spectrum and ensure my biases weren't turning the bot into "Oops, All Sunsets!".
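A plausible shape for that picker hierarchy, with the two methods named above - again, these signatures are my invention based on the description, not the bot's real API:

```python
import random

class ColorPicker:
    """Base interface: produce (start, end) color pairs for blends."""
    def pick_pair(self, prev_end=None):
        raise NotImplementedError

class PureRandomRGB(ColorPicker):
    """The trivial first method: any RGB color at all."""
    def __init__(self, rng=random):
        self.rng = rng

    def pick_pair(self, prev_end=None):
        rand = lambda: tuple(self.rng.randrange(256) for _ in range(3))
        # Chain from the previous blend's end color if one is given.
        start = prev_end if prev_end is not None else rand()
        return start, rand()

class AuthoredRamp(ColorPicker):
    """Step along a hand-defined ramp (an ordered run of colors)."""
    def __init__(self, ramp, rng=random):
        self.ramp = ramp
        self.rng = rng

    def pick_pair(self, prev_end=None):
        i = self.rng.randrange(len(self.ramp) - 1)
        return self.ramp[i], self.ramp[i + 1]
```

The palette-ramp picker would look like AuthoredRamp, except its ramps come from indices into a palette image rather than from the curated library.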

I considered building a GUI[3] to work with all this stuff at a few points in the project, but my setup already allowed pretty rapid iteration: dual-pane text editor on the right, terminal on the left, and a feh window that live-updates when the rendered image changes. I knew that past a certain point reproducibility would be important, so I made each newly generated gradient serialize out to JSON. A version of this data is what you'll see in the alt text for all the images the bot posts, and the JSON itself is also stuffed into each PNG's metadata - an interested tinkerer could use it to recreate any gradient at a different resolution, play with different settings to produce variations on it, etc.
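The round-trip is the important property there. A sketch of what serializing a gradient in and out of JSON might involve - field names mirror the alt-text dump shown in the replies below, but are otherwise guesses (embedding the result into the PNG would then use Pillow's PngInfo text chunks):

```python
import json

# Hypothetical serialization round-trip for a gradient description.
# The real bot's schema differs; the point is that everything needed
# to regenerate the image fits in a small JSON blob.

def gradient_to_json(gradient):
    return json.dumps({
        "seed": gradient["seed"],
        "size": gradient["size"],
        "shape": gradient["shape"],
        "blends": gradient["blends"],
        "dither": gradient["dither"],
    }, indent=2)

def gradient_from_json(text):
    data = json.loads(text)
    # JSON has no tuples, so restore colors to RGB tuples for PIL.
    data["blends"] = [
        (tuple(start), tuple(end), size) for start, end, size in data["blends"]
    ]
    data["size"] = tuple(data["size"])
    return data
```

Because the seed and every blend are captured, rerendering at a different resolution is just a matter of changing "size" before regenerating.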

For the first time I decided to license this code, and all the other code in my newish "multiple bot projects" repo, under the Anti-Capitalist Software License. While technically not a true "open source" software license[4], it's most in line with my principles and feels appropriate here - there's no good reason to keep this code closed, and maybe it'll help someone learn something down the line. I also licensed its image output under the CC BY-NC-SA license, on the very slim chance that the few remaining NFT bros out there come across it looking for new stuff to rip off.

While I'm calling the current state of the project "1.0", I'll absolutely never stop shopping around for cool new palettes and ramps and throwing them into the library so they'll show up in future posts. And there are definitely features I'd like to add someday[5]. But today I'm calling it good enough to launch. Enjoy! I'm happy to answer any questions. Oh, and feel free to use the Asks feature to suggest particular color palettes or ramps etc.


  1. Or maybe it's just very hard to access now? I can't tell.

  2. Although unlike Playscii, this bot's library of palette images is stored as indexed color, because that makes the data for defining ramps and the code for turning those into colors simpler.

  3. Probably using the excellent imgui, powerhouse of programmer debug UIs.

  4. Deal with it, ancap FOSS bros.

  5. I was convinced I'd end up implementing Bezier interpolation for colors at some point, but it never quite managed to become pressing.



in reply to @gradients's post:

palette suggestions:

thanks for the writeup! can you point me to a file to look at if I want to know more about the "all continuous blends" image?

Thanks for the recs! I'll add those links to my "color shopping" notes. The sharecart1000 palette is nice, I think I included that with Playscii.

Let's have a look at that particular gradient's data:
seed: 2024-08-22T23:14:26.563658
size: 256 x 256
shape: GS_VERTICAL
color pick method: COL_AUTHORED_PALETTE (pico8)
blends:
(255, 119, 168) -> (29, 43, 83), 0.5702 (ramp 4)
(29, 43, 83) -> (0, 135, 67), 0.4298 (ramp 3)
dither: DT_PATTERN (map: 8x8_dpaint2e)
flags: ALL_CONTINUOUS

So it's using the PICO-8 palette, and two of the ramps I defined for that palette, which I based on this analysis of it (the third image in this article): https://www.lexaloffle.com/bbs/?tid=3386
My comments in authored_palettes.py for those ramp definitions 4 and 3, respectively, are "pink + ROYGB" and "black to green thru blue".

"All Continuous" is a color pick "theme" of sorts that has a 20% chance to be rolled at the start of gradient generation. It just means that when generating every blend after the first, always use the previous blend's end color as this new blend's start color. This makes it so there aren't any of the hard breaks in color you'd see in gradients generated without this flag.

ahhh I was curious if you had some sort of smart palette-sorting algorithm, one that seemed like it gave great results on the pico8 palette. turns out it is a smart palette-sorting "algorithm" (human authorship)

there probably is some sort of super smart color math one could do to figure out what the interesting / natural ramps are within a given palette, but it was easy and pretty enjoyable to me to just open up each one and check the indices and manually input them.