As many commentators have already pointed out, there isn't really an industry-accepted term for this phenomenon in game design. That's because the academics who study game design (ludologists) and those who practice game design for money (game designers) tend not to read each other's work. Luckily, there's always the field of economics to bail us out.
A few months before launch, The Witcher 3 had an interesting problem: It didn't have an economy. As described in this wonderful article from Ars Technica, senior gameplay designer Matthew Steinke fixed this by first making a broad-strokes diagram to sketch out the relationships between items in a player's inventory, merchants, and loot containers. Then he used a form of statistical analysis called polynomial regression to set prices for items across the entire game. This worked so well that the team could remove the hard-coded item-pricing tables and instead generate prices for new items on the fly from his formulas. If you can make the time, I think it's very much worth your while to watch his GDC talk in its entirety on YouTube.
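The article doesn't spell out which features or polynomial degree Steinke actually used, so treat the following as a minimal sketch of the technique rather than his implementation: hand-price a handful of "anchor" items, fit a polynomial to them, and then let the resulting curve price everything else. All the numbers here are invented.

```python
import numpy as np

# Hypothetical anchor items: (power score, hand-tuned price) pairs.
# Neither the scores nor the prices come from the game; they're
# placeholders to show the shape of the approach.
anchor_power = np.array([1, 5, 10, 20, 35, 50])
anchor_price = np.array([8, 40, 120, 480, 1400, 2900])

# Fit a cubic polynomial mapping item power to price.
# (The degree CD Projekt Red actually used isn't stated anywhere I've seen.)
coeffs = np.polyfit(anchor_power, anchor_price, deg=3)
price_curve = np.poly1d(coeffs)

# Any newly generated item can now be priced from its stats
# instead of being looked up in a hard-coded table.
new_item_power = 27
print(f"Suggested price: {price_curve(new_item_power):.0f} crowns")
```

The appeal is that designers only have to hand-tune a few reference points; every other item, including ones generated procedurally, gets a sensible price for free.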
So that's a partial answer to the question, right? Scarcity is something game designers must take into account when designing a game's economy. But because game designers are (usually) not trained economists, they tend to fuck it up in some way. By the mid-point of an imbalanced game economy, money either becomes worthless (inflation), or players get frustrated because they still can't attain the best items (deflation). Preventing either situation is really, really difficult! Even when game companies do hire economists for their hat-based economies (one of whom later became the Minister of Finance of Greece), that doesn't guarantee good results either.
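To make the inflation half of that concrete, here's a toy "faucets and sinks" calculation (a standard framing in game-economy circles, not something from the sources above, and all figures are invented): if the gold the game hands out per hour consistently exceeds the gold it removes, the average player's wealth grows without bound, and prices tuned for the early game stop meaning anything.

```python
# Toy model: gold enters through quest rewards and loot ("faucets")
# and leaves through vendors, repairs, and fees ("sinks").
# All figures are invented for illustration.
gold_per_player = 100.0
faucet_per_hour = 50.0   # average gold earned per hour
sink_per_hour = 35.0     # average gold removed per hour

for hour in range(100):
    gold_per_player += faucet_per_hour - sink_per_hour

print(f"Average gold after 100 hours: {gold_per_player:.0f}")
# With faucets outpacing sinks by 15 gold per hour, a 100-gold sword that
# felt expensive at hour 1 is pocket change by hour 100.
```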
If you want to learn more about this topic, the book Virtual Economies: Design and Analysis appears to get good reviews. (I haven't read it.)
