chimerror

I'm Kitty (and so can you!)

  • she/her

Just a leopard from Seattle who sometimes makes games when she remembers to.


Janet
@Janet

sorry, i can't even remember how to get back to the source... i was angry at the twitter user for originally posting it like that, and i don't remember whether i checked the alt text either... The original picture reads:

Imagine if I asked you to learn a programming language and a culture of practices
where:

- All the variables were a single letter, pathologically so, and that programmers
enjoyed using foreign alphabets, glyph variation and fonts to disambiguate their
code from meaningless gibberish.

- None of the functions were documented independently, and instead the API
documentation consisted of circular references to other pieces of similar code,
often with the same names overloaded into multiple meanings, often impossible to
Google.

- None of the sample code could be run on a typical computer, in fact, most of it
was pseudo-code lacking a clear definition of input and output, or even the
environment and domain in which it is typically expected to be run.

- Expressive code in this language was impossible to write in a typical text editor
or word processor, and transcribing code from paper back into something useful was
an error-prone process.

- Papers describing novel ideas in the field were incomprehensible by default, so
that the world's most established experts on the subject would need years to decide
whether a particularly cutting-edge program is meaningful or in fact insane
gibberish.

- These experts not only see nothing wrong with this picture, but in fact, revel in
symbolic obscurity and ego-driven naming, dismissing sincere attempts at fixing
these deep-seated problems as meaningless taste arguments or pointless pop culture
entertainment.

If you were an experienced programmer, you would rightly call me insane.

If you were a mathematician, it would be a day ending in y.


now, while copying that out, it got me thinking:

what is described here is not that foreign to me. from the outside it always looks rosy, the idea that everything is documented, but often, when you find something you enjoy, it is you who ends up improving the documentation. as it should be. if the documentation is lacking: fix it!

also, dear twittor, twittress? hm, let's call you mathematicianado? right, M: why do you think there are so many programming languages? we use whatever we like and/or whatever is appropriate for the job. you can mix and match, like interacting with some sort of parser inside another application... so i suppose it sucks that you only have math to express math... and we only have all those programming languages to implement math and whatever else we wanna do. so we are both stuck with imperfect solutions.

dear M, your lament about the terseness of math brings one question to mind: why so terse?

because you are manipulating symbols on paper. on paper you don't want to write out the full name of every symbol: for one, it would break your parser, since hypothenuse could be read as h*y*p*o*t*h*e*n*u*s*e (real big-brain example, i know), but also, you are sometimes manipulating the symbols faster than you could write the process down in full.
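
(to make that parser joke concrete, here is a tiny python sketch i just made up; it is not real math software, just an illustration of how juxtaposition-as-multiplication falls apart once a name is longer than one letter:)

def factors(expr):
    # in blackboard convention, putting symbols next to each other means multiply,
    # so every letter becomes its own one-letter variable
    return [ch for ch in expr if ch.isalpha()]

print(" * ".join(factors("hypothenuse")))
# h * y * p * o * t * h * e * n * u * s * e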

i've done math... i'm crap at it and happy once i have poured it into concrete code. i'm not even kidding, i failed math one time too many at university, so i switched to a three-year apprenticeship. certification and 7+ years of sometimes kinda crap work (mostly due to documentation-related issues (but i also had a far too broad job description for far too long))

i used to joke my job description was "Mädchen für Alles" ("girl for everything", i.e. the person who does a bit of everything) ... :host-plead:

i didn't really get to try my implementation because of that job description, but that is another story. i was attempting to coax a motor controller + micrometer table into moving a workpiece in a certain way in order to distribute laser power very evenly... i am not sure if what i attempted was even possible with the hardware. i have no idea about the specs of the hardware in there xD i mean, sure, the electronics i get, but... the mechanics just don't talk to me. i used excel sheets to model the behaviour i wanted and then tried to instruct the microcontroller with the specced-down version it might have understood...

the math was fun. the microcontroller wanted the coefficients all the way down to jerk to calculate an S-curve. so that was fun. seriously. i hadn't done this since university, and finally i was back not just in applications programming but also in math. numerical mathematics (numerische Mathematik). using a computer to simulate a mathematical model, chef's kiss
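
(for anyone who hasn't met S-curves: you specify jerk, the derivative of acceleration, per segment, and integrate your way back up to position. here is a rough python sketch of that idea, nothing to do with the actual controller firmware or my old C# code, and all the numbers are invented:)

def s_curve(jerk_segments, dt=0.001):
    # jerk_segments: list of (duration in seconds, jerk) pairs
    # integrate jerk -> acceleration -> velocity -> position with plain euler steps
    t = pos = vel = acc = 0.0
    samples = []
    for duration, jerk in jerk_segments:
        for _ in range(int(round(duration / dt))):
            acc += jerk * dt
            vel += acc * dt
            pos += vel * dt
            t += dt
            samples.append((t, pos, vel, acc))
    return samples

# symmetric 7-segment profile: ramp acceleration up, hold it, ramp it down,
# cruise, then the mirror image to decelerate back to standstill
j = 2.0  # made-up jerk value
profile = [(0.25, +j), (0.5, 0.0), (0.25, -j),
           (1.0, 0.0),
           (0.25, -j), (0.5, 0.0), (0.25, +j)]
samples = s_curve(profile)
print("end position: %.3f, end velocity: %.4f" % (samples[-1][1], samples[-1][2]))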

but implementing them? ugh... refactoring one function or another? all the typing... I/O overhead... handling errors... memory management... all fine and dandy, but MATH? in C#?... i guess i might have used something else... mixed it in somehow? idk... the codebase was already pretty well entrenched. i think i improved it.

so, i get it. and i feel it. and i am glad lisp uses polish notation and parentheses for structure, encoding a tree as text. but i am sad that it feels like math, only slower.
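
(what i mean by "a tree as text": the s-expression (+ 1 (* 2 3)) is literally a nested structure spelled out with parentheses. a little python sketch of that reading, purely illustrative:)

# the s-expression (+ 1 (* 2 3)) written as a nested list, i.e. a tree
expr = ["+", 1, ["*", 2, 3]]

def evaluate(node):
    # a list is (operator, operands...); anything else is a leaf value
    if not isinstance(node, list):
        return node
    op, *args = node
    values = [evaluate(a) for a in args]
    if op == "+":
        return sum(values)
    if op == "*":
        result = 1
        for v in values:
            result *= v
        return result
    raise ValueError("unknown operator: " + repr(op))

print(evaluate(expr))  # prints 7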


