Created Remembrance, Permanence, & THERA
The Kyou System continued hammering and stirring, sleepless, indefatigable, at work upon the machines they were making.


posts from @KyouSystem tagged #programming

also: #software development, #coding

I've been sinking dozens upon dozens of hours over the past few weeks into converting my rewrite of the GNU Backgammon evaluation code from C into JavaScript.

It's all working like a charm so far, and I can (and will!) make good use of it in my own projects, to be sure, but still, I wonder...

Were I to publish it all as a FOSS standalone library with a very permissive license on GitHub or some such, would anyone actually care? That is, would it see any use by anyone other than me, or is backgammon so niche that even the most well-documented, easy-to-use, and liberally licensed web-compatible engine would fly completely under the radar?

I mean, since I'm already making it, I might as well release it, but sometimes it feels a bit silly to have put forth this Herculean effort to rewrite this entire engine twice over purely to add features to my games that basically no one other than me will think twice about, heh.

JavaScript is annoying, by the way. You'd think that, with me being a C programmer, I'd be fine with playing fast and loose with my data types, but JS takes it to a whole new level...



So, all of the code changes I was planning to make to GNU Backgammon are finished…

…but right as I was looking into submitting a patch, I found the code used to generate and train the neural nets used to evaluate the positions.

See, I'd been laboring under the impression that the lead maintainer for the project had some specialized setup for training the nets, and that I'd have to defer to them because of that, but no. Apparently, there's just a program in a separate repository that you can build and run that'll train them for you. I've no idea how long it'd take to run on my hardware, but it's there all the same.

Between finding that code, and seeing on the bug tracker that it took the GNUBG maintainers some 3 months to review and merge in a different code patch someone had submitted that made 10 lines' worth of changes in a single file, my perspective on the situation has changed.

Perhaps I should simply train my own neural nets and release them with this evaluation code I've been slaving over as a standalone project, as a lean & mean backgammon library free from all of the jank and cruft of GNUBG that could be attached to any arbitrary front end. It'd all still be free and open source, of course, but it'd become proper Kyou machinery that way—and I was already planning on doing something similar with the JavaScript re-implementation anyway…
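
If I go down that road, I'm picturing the library's public surface looking something like the header below. To be clear, this is just me sketching; every type and function name in it is a placeholder of my own invention, not anything that exists in GNUBG or in my code yet.

/* Placeholder API sketch; none of these names are final. */
#ifndef BG_EVAL_H
#define BG_EVAL_H

/* 25 entries per side: points 1 through 24, plus the bar. */
typedef struct {
    int player[25];
    int opponent[25];
} bg_board;

/* Win/gammon/backgammon probabilities for each side. */
typedef struct {
    float win;
    float win_gammon;
    float win_backgammon;
    float lose_gammon;
    float lose_backgammon;
} bg_eval;

/* Load a trained net from disk; returns 0 on success. */
int bg_load_net(const char *path);

/* Evaluate a single position with the loaded net. */
int bg_evaluate(const bg_board *pos, bg_eval *out);

/* Pick the best move for a given roll, writing the resulting position. */
int bg_best_move(const bg_board *pos, int die0, int die1, bg_board *out);

#endif /* BG_EVAL_H */

Any front end, whether a desktop GUI, a web game, or a command-line analyzer, would only ever need to talk to that handful of functions.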



KyouSystem
@KyouSystem

I've found and fixed two separate bugs in GNU Backgammon over the past two days, both located in the code used to calculate the inputs to the evaluation neural nets (i.e. very critical data).

They were small bugs, but fixing them made a visible change in the analysis output. I strongly suspect that, for certain difficult positions, these fixes might be enough to change what move the engine would score as the highest, even after a rollout.[1]

Do you want to know where those bugs were located in the original version of the code?

In the middle of a rat's nest of a function some 700 lines long; specifically, in obtuse sections of code completely bereft of comments, where almost every variable has a name only one or two characters long.

Is it any wonder that subtle bugs like off-by-one errors or using the wrong one-character variable for an array index (a.k.a. the bugs in question here) were able to slip in and remain completely undetected for God only knows how long?
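
To give you a flavor of how this kind of thing hides in plain sight, here's a little mock-up in the same general shape as that input-encoding code. None of it is the actual GNUBG source; the constants, names, and layout are all my own invention.

/* Mock-up only; not GNU Backgammon code. */
#include <stdio.h>

#define NUM_POINTS 25          /* invented size for this sketch */
#define INPUTS_PER_POINT 4     /* likewise invented             */

static void encode_side(const int board[NUM_POINTS],
                        float inputs[NUM_POINTS * INPUTS_PER_POINT])
{
    int i, j;

    /* Writing `i <= NUM_POINTS` here would be the classic off-by-one. */
    for (i = 0; i < NUM_POINTS; i++) {
        int nc = board[i];

        for (j = 0; j < INPUTS_PER_POINT; j++)
            inputs[i * INPUTS_PER_POINT + j] = 0.0f;

        if (nc == 1) inputs[i * INPUTS_PER_POINT + 0] = 1.0f; /* blot       */
        if (nc >= 2) inputs[i * INPUTS_PER_POINT + 1] = 1.0f; /* point made */
        if (nc >= 3) inputs[i * INPUTS_PER_POINT + 2] = 1.0f; /* spare      */
        if (nc >  3) inputs[i * INPUTS_PER_POINT + 3] = (nc - 3) / 2.0f;

        /* When everything is named i, j, or nc, writing `board[j]` where
         * `board[i]` was meant still compiles and still produces
         * plausible-looking numbers, which is why nobody notices. */
    }
}

int main(void)
{
    int board[NUM_POINTS] = { 0, 2, 0, 0, 0, 0, 5, 0, 3, 0, 0, 0,
                              0, 5, 0, 0, 0, 0, 2, 0, 0, 0, 0, 0, 0 };
    float inputs[NUM_POINTS * INPUTS_PER_POINT];

    encode_side(board, inputs);
    printf("point 1: blot=%.0f made=%.0f\n",
           inputs[1 * INPUTS_PER_POINT + 0],
           inputs[1 * INPUTS_PER_POINT + 1]);
    return 0;
}

Now imagine several hundred more lines of that, with zero comments, and you have a decent picture of what I've been wading through.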


[1] Of course, now I'm wondering if the neural nets for GNU Backgammon were trained while being fed these slightly erroneous inputs, or if these bugs crept in later. I very much hope it's the latter; I don't have the know-how or computing hardware at my disposal to train new nets myself.


KyouSystem
@KyouSystem

Here's one of the offending code sections, for your viewing pleasure, heh.


KyouSystem
@KyouSystem

While I'm at it, here's another gem: Apparently, the individual who trained the neural nets used by GNU Backgammon somehow managed to swap the sides of the board, such that the neural net thinks the player's board is actually the opponent's and vice versa. (See the attached image, lifted from the codebase before we made any changes.)

(Also, the astute C programmer will note the archaic use of scoping to declare new variables without having to put them at the top of the function. This codebase is full of old cruft like that; some of this code hasn't been touched in literal decades.)
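
For anyone who hasn't had the pleasure: C89 wouldn't let you declare variables in the middle of a block, so old code opens a bare brace pair purely to get a fresh scope for new locals. Picture something of this general shape, though to be clear this is my own mock-up, not the real source, and the sizes and layout are invented:

/* Mock-up of the idiom; not GNU Backgammon code. */
#include <stdio.h>

#define POINTS_PER_SIDE 25
#define INPUTS_PER_SIDE 50   /* two invented inputs per point */

static void encode_inputs(int board[2][POINTS_PER_SIDE],
                          float inputs[2 * INPUTS_PER_SIDE])
{
    int side;

    for (side = 0; side < 2; side++) {
        /* The archaic bit: a bare brace pair, opened solely so that new
         * variables can be declared without moving them to the top of
         * the function. */
        {
            int p;
            /* The side swap: each player's checkers are written into the
             * OTHER player's half of the input vector. */
            float *half = inputs + (1 - side) * INPUTS_PER_SIDE;

            for (p = 0; p < POINTS_PER_SIDE; p++) {
                half[2 * p]     = (board[side][p] == 1) ? 1.0f : 0.0f;
                half[2 * p + 1] = (board[side][p] >= 2) ? 1.0f : 0.0f;
            }
        }
    }
}

int main(void)
{
    int board[2][POINTS_PER_SIDE] = { { 2 }, { 0, 5 } };
    float inputs[2 * INPUTS_PER_SIDE];

    encode_inputs(board, inputs);
    /* Side 0's "point made" flag for its first point lands in the
     * second half of the array, i.e. on the opponent's side. */
    printf("%.0f\n", inputs[INPUTS_PER_SIDE + 1]);
    return 0;
}

A compiler from 1995 is perfectly happy with all of that, which is rather the problem.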


