Question going around twitter: what's the most useless piece of video game knowledge you know? (you're getting more content than twitter is :)
-
There is a fully functional Arwing from Star Fox inside Ocarina of Time. It was a debug test for the z-targeting system.
-
Majora's Mask has calendar support for more than three days and on some days the moon is very far away.
-
The Imposter Professor Oak Pokémon trading card was based on a scrapped plot line for Pokémon Gold/Silver (in fact, they threw out the entire plot and world map and started over).
-
The Paper Mario engine supports having no boots equipped and will accordingly prevent you from using jump-based attacks, though there is no way to get into this state without editing RAM.
Oh! I've got one for this! And I was recently replaying this game to finish DLC that I hadn't completed, so it's still top of mind for me! But as with all of my stories, I've gotta build it up a bit. So here goes, and please know that the example isn't exactly the way things work, but it should be close enough to understand why the game designers made this choice.
(And with apologies to folks like EoN and Sabera who are going to absolutely school me if/when I get something wrong.)
TL;DR - In Outer Wilds, you are always at the center of the universe.
There are a lot of ways to store numbers in a computer. Integers (whole numbers) are easy; take a bunch of bits, use one of them to indicate a positive or negative value, and the rest to form a number. Instead of bits, let's use decimal digits, for the non-CS folks in the audience. So if I've got 8 digits, plus a sign indicator, then my range is roughly -99999999 to 99999999. But what if I want to store values with a decimal point, or really BIG numbers? I suppose I could keep adding digits, but that's going to get ridiculous after a while; is there another way to do this, but without taking up any more digits?
Absolutely! The tool for this is something called a "floating-point" value. Instead of using all 8 like we have been, let's divide them up; six digits are in group "a", and the other two are in group "b". The "a" digits form a base value. The "b" digits though are used as an exponent. We're going to take the "a" digits, and multiply them by 10 to the power of the "b" digits. And because we want to be able to go really big or really small, we'll do an offset against the "b" digits, say subtracting 50 from their value. So to translate a series of digits "aaaaaabb" into their value, the math is something like:
(aaaaaa) x 10 ^ (bb - 50)
This is really useful! We can define numbers as small as a decimal point followed by almost 50 zeroes and then our "a" value, or as large as our "a" value with 49 zeroes after it! That's an astonishingly massive range of values, without using any more data than we originally were for our integer!
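(If you want to play with this, here's a tiny Python sketch of that made-up eight-digit format. The six/two digit split and the 50 offset are the toy numbers from above, not any real standard.)

```python
# Toy "decimal floating point": 8 digits total, stored as the string "aaaaaabb".
# The first six digits are the base value, the last two are an exponent offset
# by 50, i.e. value = aaaaaa * 10 ** (bb - 50). This mirrors the made-up format
# above, not a real standard like IEEE 754.

def decode(digits: str) -> float:
    """Turn an 8-digit string 'aaaaaabb' into the number it represents."""
    a = int(digits[:6])   # base value
    b = int(digits[6:])   # exponent, offset by 50
    return a * 10.0 ** (b - 50)

def encode(value: float) -> str:
    """Find the closest 'aaaaaabb' representation for a positive number."""
    # Walk the exponents from small to large and keep the first one whose
    # base value still fits in six digits. Crude, but fine for a demo.
    for b in range(0, 100):
        a = round(value / 10.0 ** (b - 50))
        if a <= 999999:
            return f"{a:06d}{b:02d}"
    raise ValueError("value too large for this toy format")

print(decode("10000051"))   # 1000000.0  (a million, as in the example below)
print(encode(1_000_000))    # '10000051'
print(decode("00000100"))   # 1e-50, about as small as this format goes
```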
But as is often the case, there's a trade-off, and it comes in the form of precision, or more specifically, granularity. We may be able to represent a huge range of large or small numbers, but the larger the number we want to show, the larger the steps between each number we can display. Remember, we've only got those six "a" digits to work with. If you want to show a million, that's fine (10000051). But because you only control the six biggest digits, you couldn't show a million and one; you'd have to jump from a million to a million and ten (10000151). The larger your exponent gets, the larger the gap between consecutive values you can represent, and the more distance you have to jump between them.
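The exact same thing happens with the real floating-point numbers your computer uses. Python's everyday floats are 64-bit IEEE 754 doubles, and you can watch the gaps grow as the numbers do:

```python
import math

# math.ulp(x) (Python 3.9+) is the gap between x and the next representable double.
print(math.ulp(1.0))          # ~2.2e-16: steps near 1 are microscopic
print(math.ulp(1_000_000.0))  # ~1.2e-10: bigger number, bigger steps
print(math.ulp(1e16))         # 2.0: out here you can't even step by 1

# Which means that adding 1 can literally do nothing:
x = 1e16
print(x + 1 == x)             # True: the "+1" falls inside the gap and rounds away
```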
How is this relevant to video games? Well, a lot of things that need to be known or calculated with arbitrarily high precision (say, the exact position of the camera) are often stored as floating point values. In the game, everything exists as a position that's calculated relative to the "center" of the game space. If you get too far from this center, you start running into the limits of floating point granularity; your exponent gets too big, and the camera or other objects can no longer move in a way that looks smooth to the player. Most games solve this by redefining the "center" on the fly as you move between areas, or as the game loads in new parts of the level. If you change where the "center" is, then your player doesn't ever get too far from it, because that "center" keeps moving.
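(A very rough sketch of what that moving-"center" trick, often called a "floating origin", can look like. This is a generic illustration I made up, not any particular engine's code.)

```python
# Hypothetical floating-origin scheme (the names and numbers are made up):
# whenever the player drifts too far from the world origin, shift everything,
# player included, back toward it so coordinates never get huge.

RECENTER_DISTANCE = 5_000.0  # made-up threshold

class World:
    def __init__(self):
        self.player_pos = [0.0, 0.0, 0.0]
        self.object_positions = []  # positions of everything else in the scene

    def maybe_recenter(self):
        dist = sum(c * c for c in self.player_pos) ** 0.5
        if dist > RECENTER_DISTANCE:
            shift = list(self.player_pos)
            self.player_pos = [0.0, 0.0, 0.0]  # the origin is now where the player was
            for pos in self.object_positions:
                for i in range(3):
                    pos[i] -= shift[i]
            # A real engine also has to patch up physics state, cached transforms,
            # particle systems, and so on, which is where the hiccups come from.
```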
Early in the development of "Outer Wilds", though, this became something of an issue. The entire game is a simulation; the whole Hearthian system has to be active at all times (albeit at different levels of fidelity) for things to work the way they're supposed to. Changing the "center" on the fly was apparently causing issues: everything in the game is in motion relative to everything else, so moving the "center" may have been causing hiccups as the game suddenly had to recalculate a bunch of positions and vectors and forces. But leaving the center in the middle of the system created other problems, because players visiting the outermost reaches of the system would run into these floating point issues and start getting jitters.
So the developers implemented an ingenious solution - The player character is always at the same point relative to the "center", absolutely unmoving, and any time the player "moves", the game pushes everything else in the opposite direction to make it appear that the player character is moving. The planets, the sun, EVERYTHING. Any time you jump, your player character is completely stationary; instead, a force is applied to everything else in the goddamn game to push it away from you before gravity brings it all back. From a mathematical perspective, the calculations have the same result. And the processing overhead was minimal since they were already doing a bunch of physics calculations for the other objects in the game at all times.
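(And a toy sketch of that inverse trick, again just illustrating the idea rather than the game's actual code: the player never moves, and their intended thrust gets applied, negated, to everything else.)

```python
# Toy "the player IS the origin" physics step. Everything here is made up for
# illustration; the real game does this inside Unity's physics engine.

class InvertedWorld:
    def __init__(self, object_positions, object_velocities):
        # The player has no position or velocity of their own: they are the origin.
        self.object_positions = object_positions    # one [x, y, z] per object
        self.object_velocities = object_velocities  # one [vx, vy, vz] per object

    def apply_player_thrust(self, thrust, dt):
        # Instead of accelerating the player, accelerate everything else the
        # opposite way. The relative motion, and what you see on screen, is identical.
        for v in self.object_velocities:
            for i in range(3):
                v[i] -= thrust[i] * dt

    def step(self, dt):
        for pos, v in zip(self.object_positions, self.object_velocities):
            for i in range(3):
                pos[i] += v[i] * dt

# "Jumping" with thrust (0, 10, 0) makes every planet accelerate downward
# relative to you, which on screen looks exactly like you going up.
```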
I've always been a fan of clever hacks, and this one absolutely takes the cake for me. One of the key design philosophies of Outer Wilds is that you as the player aren't at the center of the world, but technically, the exact opposite is true.
