ticky




shoutout to @clip for prompting me to update this script - I use it to convert my VRChat photos from very large 3840×2880 PNGs into much smaller 3840×2880 HEIF files (which to my eye are nigh indistinguishable in quality), while preserving their original creation date and time, to import into Apple Photos
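the post doesn't include the conversion commands themselves; a minimal per-file sketch of the same idea, assuming macOS's built-in sips for the HEIC encode (the real script may use a different encoder, and `touch -r` only carries over the modification time, not the full creation metadata):

```shell
# hypothetical sketch: encode each PNG as a HEIC at the same resolution,
# then copy the original file's timestamp onto the new file
for f in *.png; do
    out="${f%.png}.heic"
    sips -s format heic "$f" --out "$out"
    touch -r "$f" "$out"   # carry the original file time over to the HEIC
done
```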

this update takes it from processing a set of 69 PNGs (nice) totalling about 650 MB in 19 seconds sequentially, to processing them in parallel on my machine!

I originally spent a bit of time trying to work out how to make GNU Parallel call a shell function, but I realised I could just… make the script call itself via GNU Parallel!

plus it still works if you don't have parallel, it just does it the old way. going to have to see what other batch operations I can upgrade like this… 🤔
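the shape of that pattern - a script that re-invokes itself through parallel when it's available, and falls back to a plain loop otherwise - might look something like this (the per-file body here is a hypothetical stand-in for the real conversion work):

```shell
#!/bin/sh
# hypothetical sketch of the self-invoking pattern
# usage: ./batch.sh file1.png file2.png ...

# worker mode: when called with --one, process a single file and exit
if [ "$1" = "--one" ]; then
    shift
    # (real per-file work - e.g. the PNG -> HEIF conversion - goes here)
    echo "processed $1"
    exit 0
fi

if command -v parallel >/dev/null 2>&1; then
    # GNU Parallel re-invokes this very script once per input file
    parallel "$0" --one ::: "$@"
else
    # no parallel installed: do it the old way, sequentially
    for f in "$@"; do
        "$0" --one "$f"
    done
fi
```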



in reply to @ticky's post:

PLEASE CITE THE USE OF THIS PROGRAM IN YOUR ACADEMIC PAPER IT COSTS YOU NOTHING BUT BENEFITS US GREATLY

PLEASE TELL US YOU WILL CITE IF YOU DON'T CITE WE WILL STEAL YOUR COMPUTER

yeah I've done this a few times a few different ways, ranging from parallel to xargs to... forking background jobs and communicating with the parent via FIFO (used bash coprocesses for this but you can do it other ways too if you want to be slightly more POSIX-ey)

I like xargs better for trivial batch processing cases. It does a lot less than parallel, it's already installed on everything, and you don't have to sign over your thesis/firstborn/cat to GNU in order to use it. Tools like sort and find already have a mode that outputs null record separators, which xargs can parse, so being whitespace-safe with it is trivially easy.
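for example, the null-separated find | xargs combo, with -P for parallelism (echo here stands in for whatever the real per-file command is):

```shell
# -print0 emits NUL-terminated paths; xargs -0 splits on NUL, so
# filenames with spaces or newlines pass through intact.
# -P 4 runs up to four jobs at once; -n 1 gives each job one file.
find . -name '*.png' -print0 | xargs -0 -P 4 -n 1 echo converting
```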

The bash native cases were all instances of adapting an existing script to parallelize a bottleneck in a way that doesn't lend itself to an external process manager like the other two - e.g. when the thing is a bash function, or needs to feed back into the parent script in a way that isn't trivial. Every single one of these cases was cursed and I would never recommend it (but if you're curious, the parallel package usually ships with sem, which lets you use semaphores from a command-line interface within your own scripts, in case you want to do some really messed up stuff)
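a much tamer shape of the shell-native approach - background jobs with a crude concurrency cap, where process_one is a hypothetical stand-in for the real function - looks like:

```shell
# cap concurrency with plain background jobs + wait; unlike parallel
# or xargs, this can call a function defined in the same script
process_one() {
    # hypothetical per-file work
    echo "done: $1"
}

max=4 n=0
for f in "$@"; do
    process_one "$f" &
    n=$((n + 1))
    # crude batching: wait for the whole batch before starting another
    if [ "$n" -ge "$max" ]; then
        wait
        n=0
    fi
done
wait   # wait for the final partial batch
```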

what're the space savings like with HEIF in your use? i've been running mine through oxipng nightly but that only tends to net around 30% space savings

(i also don't use the in-game focus options, which murder any sort of png compression, but that's less a space thing and more that i never think to |3)

for my purposes I find the JPG output from VRChat a bit too rough, but I like taking 4K shots, and with how little I can tell the difference between the PNG and the HEIF I’m happy to use that as my default

the other bonus is it’s much quicker than even the venerable oxipng!

yeah i haven't used the JPG output on vrchat's camera in some time, also taking pictures in 4k. it makes sense it's a little crunchy too - even with off-thread file saving, it's compressing an image while the rest of the game is running

i can't recall the timings on oxipng though, since i have it run in a nightly batch when i'm not using the computer. time shifting computing |3

gonna explore things for sure - there's a whole spaghetti mess of bash scripts i'm using that back things up to different places, and i'm sure i can put in something to convert much older images, especially if HEIF supports transparency so i don't need to worry about the sprinkling of cutout shots i've taken