chimerror

I'm Kitty (and so can you!)

  • she/her

Just a leopard from Seattle who sometimes makes games when she remembers to.


Janet
@Janet

sorry, i can't even remember how to get back to the source... i was angry at the twitter user who originally posted it like that, and i don't remember whether i checked the alt text either... The original picture reads:

Imagine if I asked you to learn a programming language and a culture of practices
where:

- All the variables were a single letter, pathologically so, and programmers
enjoyed using foreign alphabets, glyph variations and fonts to disambiguate their
code from meaningless gibberish.

- None of the functions were documented independently, and instead the API
documentation consisted of circular references to other pieces of similar code,
often with the same names overloaded into multiple meanings, often impossible to
Google.

- None of the sample code could be run on a typical computer; in fact, most of it
was pseudo-code lacking a clear definition of input and output, or even of the
environment and domain in which it was typically expected to run.

- Expressive code in this language was impossible to write in a typical text editor
or word processor, and transcribing code from paper back into something useful was
an error-prone process.

- Papers describing novel ideas in the field were incomprehensible by default, so
that the world's most established experts on the subject would need years to decide
whether a particularly cutting-edge program was meaningful or in fact insane
gibberish.

- These experts not only saw nothing wrong with this picture but in fact reveled in
symbolic obscurity and ego-driven naming, dismissing sincere attempts at fixing
these deep-seated problems as meaningless taste arguments or pointless pop-culture
entertainment.

If you were an experienced programmer, you would rightly call me insane.

If you were a mathematician, it would be a day ending in y.


now, while copying that out, it got me thinking:

what is described here is not that foreign to me. it always looks rosy from the outside, the thought that everything is documented, but often when you find something you enjoy, it is you who starts improving the documentation. as it should be. if the documentation is lacking: fix it!

also, dear twittor, twittress? hm, let's call you mathematicianado? right, M: why do you think there are so many programming languages? we use whatever we like and/or whatever is appropriate for the job. you can mix and match, like interacting with some sort of parser inside another application... so i suppose it sucks, since you only got math to express math... and we only got all those programming languages to implement math and whatevs we wanna do. so we are both stuck with imperfect solutions.

dear M, your lament at the terseness of math brings to mind one question: why so terse?

because you are manipulating symbols on paper. on paper you don't want to write out the full names of your symbols: for one, it would break your parser, since hypotenuse can be parsed as h*y*p*o*t*e*n*u*s*e (real big brain example, i know), but also, you are sometimes manipulating the symbols at a higher rate than you can write down the process.
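to make that concrete, here's a wee sketch (hypothetical python, not anything from the thread) of what happens when a notation treats adjacency as multiplication:

```python
# hypothetical sketch: a toy tokenizer for paper-style notation where
# every letter is its own variable and adjacency means multiplication.
# under that convention a multi-letter name like "hypotenuse" really
# does come out as a product of one-letter variables.

def tokenize(expr: str) -> list[str]:
    """Split an expression into tokens, one-letter variables only."""
    tokens = []
    for ch in expr:
        if ch.isalpha() or ch in "+-*/()":
            tokens.append(ch)
        elif not ch.isspace():
            raise ValueError(f"unexpected character: {ch!r}")
    out = []
    for i, tok in enumerate(tokens):
        out.append(tok)
        # insert the implicit '*' between adjacent variables
        if tok.isalpha() and i + 1 < len(tokens) and tokens[i + 1].isalpha():
            out.append("*")
    return out

print("".join(tokenize("hypotenuse")))
# -> h*y*p*o*t*e*n*u*s*e ... exactly the big brain parse above
```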

i've done math... i'm crap at it and happy once i've poured it into concrete code. i'm not even kidding: i failed math one time too many at university, so i switched to a three-year apprenticeship. certification and 7+ years of sometimes kinda crap work (mostly due to documentation-related issues (but i also had a far too broad job description for far too long))

i used to joke that my job description was "Mädchen für Alles" (girl for everything, i.e. jack of all trades)... :host-plead:

i didn't really get to try my implementation because of that job description, but that is another story. i was attempting to coax a motor controller plus micrometer table into moving a workpiece in a certain way in order to distribute laser power very evenly... i am not sure if what i attempted was even possible with the hardware. i have no idea about the specs of the hardware in there xD i mean, sure, the electronics i get, but... the mechanics just don't talk to me.

i used excel sheets to model the behaviour i desired and then tried to instruct the microcontroller with the specced-down version it might have understood... the math was fun. the microcontroller wanted the coefficients down to jerk to calculate an S-curve. so that was fun. seriously. i hadn't done this since university, and finally i was back not just in applications programming but also in math. numerische Mathematik (numerical mathematics): use computers to simulate a mathematical model. chef's kiss
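(for the curious, here's roughly what that S-curve math looks like, as a hypothetical python sketch with made-up limits, since i never knew the real hardware's specs: constant jerk integrates up into acceleration, then velocity, then position.)

```python
# hypothetical sketch of a jerk-limited S-curve acceleration phase.
# all limits are assumed numbers, not the real hardware's specs.
J_MAX = 2000.0   # mm/s^3, assumed jerk limit
A_MAX = 200.0    # mm/s^2, assumed acceleration limit
V_MAX = 100.0    # mm/s,   assumed velocity limit
DT = 1e-4        # s, integration step

def s_curve_accel_phase():
    """Accelerate from rest to V_MAX: jerk +J, hold A_MAX, jerk -J."""
    t_ramp = A_MAX / J_MAX                        # time to reach A_MAX
    v_ramps = A_MAX * t_ramp                      # velocity gained across both ramps
    t_hold = max(0.0, (V_MAX - v_ramps) / A_MAX)  # constant-acceleration time
    schedule = [(+J_MAX, t_ramp), (0.0, t_hold), (-J_MAX, t_ramp)]

    t = a = v = x = 0.0
    for jerk, duration in schedule:
        for _ in range(int(duration / DT)):
            a += jerk * DT   # jerk integrates into acceleration
            v += a * DT      # acceleration integrates into velocity
            x += v * DT      # velocity integrates into position
            t += DT
    return t, a, v, x

t, a, v, x = s_curve_accel_phase()
print(f"after {t:.3f}s: a={a:.1f} mm/s^2, v={v:.1f} mm/s, x={x:.2f} mm")
```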

but implementing them? ugh... refactoring a function or another? all the typing... I/O overhead... handling errors... memory management... all fine and dandy, but MATH? in C#?... i guess i might have used something else... mixed it in somehow? idk... the codebase was already pretty set in its ways. i think i improved it.

so, i get it. and i feel it. and i am glad lisp uses polish notation and parentheses for structure, encoding a tree as text. but i am sad that it feels like math, but slow.
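(to illustrate what "a tree as text" means, a hypothetical python sketch, nothing lisp-specific required:)

```python
# hypothetical sketch: parentheses + polish notation are a serialized tree.
# "(+ 1 (* 2 3))" parses straight into nested lists, i.e. the syntax tree.

def parse(src: str):
    """Parse one s-expression into nested Python lists."""
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()

    def read(pos: int):
        if tokens[pos] == "(":
            node, pos = [], pos + 1
            while tokens[pos] != ")":
                child, pos = read(pos)
                node.append(child)
            return node, pos + 1
        tok = tokens[pos]
        return (int(tok) if tok.lstrip("-").isdigit() else tok), pos + 1

    tree, _ = read(0)
    return tree

print(parse("(+ 1 (* 2 3))"))   # -> ['+', 1, ['*', 2, 3]]
```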


ireneista
@ireneista

this is a topic we have longstanding thoughts on, but the discussion up-thread is so thorough that we have nothing to really add to it other than this

please be deliberate about whether you are writing code, doing math, or both. don't allow practices optimized for one to bleed into the other without at least noticing the bleed and turning it into a conscious choice.


cgranade
@cgranade

I agree with much of the above, but I also want to share a bit as someone with a fairly extensive background in both mathematics and software development. In particular, this is a sentiment that I've seen a lot of, but I've seen comparatively few attempts to improve the situation on the math side of things.

In part, I would posit that's because math is extremely difficult to denote well — when you program, you do so against a precisely specified semantic model, or at least a documented or implied one. Math, on the other hand, revels in making those semantics part of the object under study. Every paper has its own set of axioms that it works from, its own formal system, since the thing being studied is, in effect, the set of consequences those axioms have.

Math is also often extremely dense, with fairly complex equations appearing as single atomic facts (this is especially true during derivations, before eliminating and cancelling out various terms), such that patterns can be harder to see when using more verbose notation.

A third challenge is that often symbols appearing in mathematical notation have no independent meaning. They exist to describe relations that hold between arbitrary symbols. For example, in the equation 𝑥² − 1 = (𝑥 + 1)(𝑥 − 1), 𝑥 is completely and entirely meaningless. That's indeed the point, that the equation holds without respect to what particular meaning a specific assignment of 𝑥 has. If you try to give a more semantic name for 𝑥, it quickly becomes apparent that it is only a placeholder, defying any attempt to give it a logical name.
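To make the "placeholder" point concrete, the identity can be stated for every integer 𝑥 at once; here is a minimal sketch in Lean 4, assuming Mathlib's `ring` tactic:

```lean
-- Minimal sketch, assuming Lean 4 with Mathlib. The x here is a bound
-- variable: the statement quantifies over every integer, which is why
-- no "semantic" name for x could be more informative than x itself.
import Mathlib.Tactic

example (x : ℤ) : x ^ 2 - 1 = (x + 1) * (x - 1) := by ring
```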

These are all, I would suggest, artifacts of the same underlying issue: mathematical notation is not globally optimal, but it is one hell of a local optimum, achieved over literal centuries of effort. When proposing new notation, you're quite likely to trip over the reasons that mathematical notation is the way it is. That's not to say it can't be better, or that there aren't good efforts to make it so, or that the problems with mathematical notation don't do real harm by reinforcing phobias about math induced by inhumane approaches to "education." Far from it. It just means that mathematical notation is effectively built for a different purpose and solves different problems than most programming languages are intended for. As a result, it's difficult (not impossible or unimportant — difficult) to constructively apply lessons learned from programming languages to improve mathematical notation.

</soapbox>




in reply to @ireneista's post:

also, i'm pretty sure the ratio of expository text to actual mathematical symbology in the typical math paper is much larger than the ratio of comments to code in any piece of software. :p

not QUITE true, but in general for sure

the literate programming tradition, invented by Knuth, is to have a book which is also a program, and you write them in the same file

the Inform 7 source code is an awesome example that you can go read on the web

also: math's symbolic nature was only pointed out to me by this text i'm quoting. i wasn't really aware of it being that way; i was just using it. reading about symbolic obscurity made it click, and i thought: oh, just like in lisp, where you have symbols that evaluate to values that might be functions too... you can (for lack of a better word) code-dive in lisp just the same way as in math, but better, because here symbols have sane names
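(roughly, as a hypothetical python sketch of that lisp idea: a symbol is just a key in an environment, and the value it names can itself be a function.)

```python
# hypothetical sketch: symbols are names looked up in an environment,
# and the value a symbol names may itself be a function.
env = {
    "pi": 3.141592653589793,
    "double": lambda n: 2 * n,   # a symbol bound to a function value
}

def evaluate(expr, env):
    """Evaluate an atom or a (function arg...) call, lisp-style."""
    if isinstance(expr, str):    # a symbol: look up its value
        return env[expr]
    if isinstance(expr, list):   # a call: evaluate head and args, then apply
        fn, *args = (evaluate(e, env) for e in expr)
        return fn(*args)
    return expr                  # literals evaluate to themselves

print(evaluate(["double", "pi"], env))   # -> 6.283185307179586
```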

in reply to @cgranade's post:

thank you all for the input. i have a strong interest in developing a symbolic editing environment, mainly for source code in vr, but i could imagine using it for anything symbolic, and math is pretty close to what i already had in mind... but for now it's bedtime for me. again, thank you for the food for thought :host-love: