tomforsyth
@tomforsyth

(this is a rant I had on Mastodon, slightly cleaned up, but still a bit scatterbrained)

Thinking about when people talk about game engines making things more "even" between 3-person indie groups and AAA titles. But it's kinda bullshit. And also irrelevant. Lemme splain.


vectorpoem
@vectorpoem

a few different non-mutually-exclusive notions of how to quantify a game's value in the mix, here. (these are quantitative lenses, offered with full acknowledgment that an artistic medium isn't ultimately made of numbers like this.)

content quantity - how many unique authored pixels/texels, frames of animation, units of accessible world space, seconds of cinematics and recorded audio, goals, interactive permutations, etc are there on the (literal or figurative) disc? how do the systems serve to condense and/or extend this content? games that lean hard on procedural generation systems can obviously be a multiplier on this, but even something like random monster spawns or loot tables has a small effect along this axis. a good engine can ease many of the developer burdens that limit how much content devs can ship in a game - take the same team in parallel universes A and B trying to make the same game with two different engines X and Y, where X is better at, let's call it "content wrangling", than Y: the game from universe A (using engine X) will ship with more content packed in.

content fidelity - how much unique authored raw information makes it onto the final rendered frame / audio buffer, rendered at its output resolution + framerate? this was the arms race that games became obsessed with around the turn of the 1990s and rode until around a decade ago, after which a point of noticeably diminishing returns kicked in. (this is not to say that high end games aren't still pushing fidelity in 2023 and won't continue for the foreseeable future, going for eg 4K everything with no upsampling, even supersampling from massive textures and 8-digit polycount source meshes, etc. but i think it's safe to say that compared to 20-25 years ago, pure fidelity is no longer the force driving the economics and processes of game development.) by most definitions, a good game engine will be able to throw more pixels and polygons at the screen, and the user's perception of the content's fidelity will be higher. pretty straightforward here.

content quality - the devil of subjectivity enters: how nice does any given bit of content / output look, sound, play? obviously a skilled artist and a novice artist will be able to achieve very noticeably different results with the exact same target resolution. quake 1's textures look gorgeous compared to those of most of its contemporaries. this is a complicated argument that i should probably write an entire book about someday, but good tools allow devs to arrive at higher quality results, more quickly, which allows them to not only do a better job with each piece of content but also to try more things. it's not just about the commonly cited "lowering iteration times" - better tools can fundamentally transform your understanding of what you're making, allow you to "see further", make better decisions and become a wiser, more powerful creator. amazing quality results can and have been achieved even with godawful tools - but hop in your time machine and give those same folks better tools, and you'll get a better game.

labor intensivity - how many person-hours of work went into the game? how many people are on the project? how many hours do they work? what is their quality of life? how efficient do the tools allow them to be? if it sounds weird to speak of these as a major metric of game quality, ask a game dev how much worse a job they do after 20 hours awake + 7-day weeks vs fully rested and unstressed. a good engine can help ensure that each moment devs spend working on a game results in a corresponding quality improvement. and on the whole, game engines and tools have inarguably gotten better at this in the last 20-30 years. we love retro games but nobody still wants to build anything in 3D Studio MAX r1 (1996).

capital intensivity - how much money was spent on the labor and content that resulted in the finished product? this is the most complex equation of all, but it's at the dark heart of the entire games industry (which, it must be said in 100ft flaming letters, is not the same thing as the creative medium of games). when you look at an E3 briefing, you are witnessing a capital expenditure that has been optimized to maximize return on investment.

We had a derived stat from Mass Effect that the average user could consume about $170,000 worth of level art per minute.

because capital sits upstream from labor in the process, everything that's true of the relationship between game engines and labor-intensivity is also true of capital-intensivity, with the addition of being able to brag that your platform or game engine is more powerful than your competitors'. and lo, the battlefields of the console wars will run red with slurs, as ever they have.

so obviously, to the OP's point, big budget teams have advantages in almost all of these areas. compared to a typical indie team they can create more content, at a higher fidelity, and throw way more money and thus people at a production. of course, they also pay higher coordination costs, with ubisoft's multinational 1000s-strong projects posing massive logistical and production challenges at the extreme end of the spectrum. broadly, big teams have less agility, less ability to experiment, and can take far fewer risks. 10-15 years ago folks working in what became the "indie games industry" were able to find new niches, revive old ones, and the costs of doing so weren't the deterrent they are for a big corporation. this dynamic has been fundamental to cultural production so long as the concept has existed, and will continue indefinitely.

what i think has changed dramatically in more recent eras is the ways in which tools (including engines) allow smaller teams to "punch above their weight", producing higher quantities of higher quality content with less labor (and thus less capital)1. as an example, i'd argue that id software's rapid rise in the early 90s was due in large part to its commitment to tools - the codebase they shipped Commander Keen 1: Marooned on Mars with allowed them to make tons of other games quickly, increase their skills, try different genres, and get more creatively ambitious. they carried this through to Wolf3D and Doom, using cutting edge NeXT workstations to quickly design 3D levels when most devs were still plodding along making things in 2D tile editors. this kicked the already hot tech and tools races of the 90s into a higher gear, and ultimately once big publishers learned how to harness this dynamic we got modern industrial-scale game production in the 00s, in all its grandeur and horror.

but to be clear i agree with the OP's broad assertions: there is no real "leveling the playing field" tool or tech or process that will let a team of 5 people produce the same work as a team of 50 or 100 people. even if every dev had access to every tool, small productions have too many inherent differences from large ones. i don't think that invalidates any of the amazing things that better tools can do, and i think general awareness of this is only growing with time, as AAA scrapes the stratospheres of capital-intensivity (often only sustainable with massive consolidation! me reaping: hell yes, etc) and the PS5 gen's hardware architecture resembles a planet-dismantling mass driver for putting money on screen, and more and more of the audience deprograms and steps off the 90s-00s train of thought that says that those pure quantitatives - "more content = better game, more fidelity/money = better game" - determine what they'll find compelling. instead, we'll hopefully continue to have a wider range of folks, making a wider range of stuff, for a wider range of folks, and having a better time doing it.

1 it's also true that lots of design trends of the past decade, from permadeath to procedural worlds to user-created content, have been about wringing a better quality / more valuable game out of the same amount of content - but that's more in the realm of design and creative constraints than tools or technology.


johnnemann
@johnnemann

There are three reasons, imo, this is true. Number one is team size. AAA teams these days consist of hundreds of employees, plus large teams of external contractors, engaged together in a creative process! This shapes the end result far more than any tool can compensate for.

Number two is financing. AAA companies have millions or billions of dollars on the line simply by existing. These dollars come from publishers and investors, large institutions with immense power and control.

And the result of both of those is a process and culture that will, sadly, never be able to equal the heights of indie game development. No matter the tools used, the combination of financial incentives, external pressure to hit a mass audience, and the bare difficulty of executing on any sort of creatively-challenging work among hundreds of people will prevent AAA from meeting the quality bar set by small indie studios who are creatively and financially flexible enough to make bold decisions and experiment at the edges of the medium.

It is really inspiring to see big companies and studios that are doing so much better now that they have some access to the real strengths of the medium (queer developers and niche audiences), though! I hope they continue to push themselves! You can do it!! :-)



in reply to @tomforsyth's post:

"indie games still look indie"

This reminds me of that primal, societal fear that Winston tries to threaten O'Brien with in 1984: "Sooner or later they will see you for what you are, and then they will tear you to pieces." That feeling that the first ape who was cognizant that they were stealing from another ape felt, knowing that they were fundamentally lying and cheating someone else and they had to hide that or get torn to pieces.

I dearly wish that I could cite whoever planted this notion in my head, but what I'm getting at is that the "acceptance" of indie games feels like it's always been kind of correlated with how "non-indie" those indie games looked. Cave Story looked exactly like a SNES game. Super Meat Boy did unique things with vector-like shapes and giant, zoomed-out screens that hid its low pixel/texture fidelity. Minecraft did a similar thing by trading model and texture quality for scale. Alien Hominid, World of Goo and Braid were carried by unique 2D artwork that felt like it was designed that way, instead of "we aren't skilled enough/able to afford to do something else." I think that's the crux of it - that indie games were accepted only when they didn't give the impression that they looked the way they did due to incompetence or lack of funds, but on PURPOSE.

You can see this in the relative lack of success that engine-made games had in the 90s and early 2000s. 2D games made in RSD Game-Maker and Clickteam's products (and to a lesser extent, Game Maker 4-6) had this obvious jank in physics and gameplay that, even when they were sold, often didn't result in any noteworthy numbers. This lack of funds - or the inability to sprite or compose - often led them to use graphics and music from existing games, such as Sonic, Mario, and Mega Man. Even if you had the capability, or at least the willingness to learn to make your own, it was very hard to dig up the intrinsic motivation to do so when the underlying programming would still make your game feel crude and "indie-like" unless you had the programming chops behind your game that Cave Story did. Otherwise, consumers would "see you for what you were," knowing that they deserved better from games asking for their time. The same happened with interactive fiction's attempted resurrection in the 90s by Adventions and Cascade Mountain Publishing. Even though good engines such as TADS and Inform had finally arrived to make writing your own Infocom-quality - and beyond, in my opinion - interactive fiction possible, nobody wanted a "crude" graphicless, soundless game that was interacted with via typing. Although I disagree that this makes the genre invalid, that counted as "jank" for people. The general public felt that they deserved better, and so these games sold in small numbers. I'll get to games "being as good as their forefathers but no one cared" later.

This funding model continued with the rise of monetized Flash games around 2005, where games - formerly given away for free (because again, why go to all that effort) - were usually not sold but rather given away under an ad-supported model, as seen with MochiAds and, successfully, in the Papa Louie games. Papa Louie didn't try to sell itself to you as an equal. It didn't even try. It just said "okay, you'll get a free game with addicting restaurant-management gameplay and in return you can watch these ads." It gave you the game before asking you to pay for it, unlike AAA games. It didn't even pretend that it could compete at that level. (This type of shareware/time-trial model was also seen in Apogee/id's works in the 90s and in casual games like Diner Dash in the aughts.)

As the Digital Antiquarian, Jimmy Maher, says on the commercial death of interactive fiction and point and click adventures, there seemed to be this sense among a lot of gamers that the current genres in the artform weren't valid in and of themselves, but merely as stopgaps until computers advanced enough to give you the "optimal version" of what the current games merely asked you to imagine you were playing.

Zork didn't have graphics, but it would have to do.
Wizardry and Ultima didn't have real-time gameplay, but it would have to do.
2D Sonic and Zelda didn't have 3D graphics, but it would have to do.

You can see that these games' sequels tried to reinvent themselves to the public later with all those things they "lacked," right? Return to Zork had graphics. Wizardry 8 and Ultimas 7-9 had real-time gameplay. Five years after Link's Awakening, Zelda was "finally" 3D, and the same goes for Sonic Adventure, four years after Sonic & Knuckles.

I understand that I set up two seemingly-competing ideas - first, people being unable to program games, so the games felt janky and nobody wanted to pay for them; then, people finally getting good engines that made games that felt just as good to play as the 80s/90s classics, but the games still looked cheap, and nobody wanted to pay for them - but that's what I was trying to set up earlier. A lot of gamers seem to "sniff out" these games like a shark smelling blood and "see them for what they are" so that they don't get scammed - if you're not skilled enough to make an AAA game, they'll see you for what you are. If you can't afford to produce an AAA game, they'll see you for what you are.

I feel like a lot of the disparagement by the public and fear by the developers towards games "looking indie" is still rooted in that sentiment. That Zork, Ultima 4, Sonic 1, and Link to the Past weren't what consumers REALLY wanted, and that the companies were just selling them a stopgap. Now that fully 3D games have come about, selling a game that isn't - or God forbid, being proud of being so - still raises a certain type of forumgoer's hackles. "I'm tired of 2D stuff," they say. It feels like a conspiracy - a scam by indie developers - to them to suggest that 2D games are just as good as 3D games. They feel like 2D stuff is only borne from incompetence or a lack of funds, and that 3D stuff is the platonic ideal that the medium should converge towards.

And, in their defense, it can be kind of frustrating if you buy into what the publishers told you. Ocarina of Time was the "natural evolution" of Link to the Past, and Twilight Princess was the natural evolution on top of that game. Final Fantasy VII is the heir to FFVI, and FFX the heir to that. Even Mario took a break from 2D games for about a decade. It feels like the misguided question "why are there still monkeys around?" - as if evolving is the goal for every game. I'd also like to be charitable and point out that if you want something like Super Mario 64, or Ocarina of Time, or Final Fantasy VII... there still is kind of a paucity. Minecraft is the only 3D game that I've seen remotely "pulling a Cave Story," and not only did it get a second coder, but it eventually got its own musicians and artists. "Real" ones, not Notch and Jeb messing around, because now that Microsoft owns it, they don't want to be seen as incompetent, scamming you with a cheap-looking game.

I guess what I wanted to say is that, in their rush to sell new games, there is a concerted effort by big publishers to tell people that "You didn't really want Zork, Ultima, King's Quest, Sonic, or Zelda. We just couldn't give you what you really deserved, and we're sorry, but now we can! Go buy it!" that makes people think that interactive fiction, RPGs, adventure games, 2D platformers, and 2D action-adventure games are inherently primitive and unlikeable - that anyone who said they liked them way back when was just tricked into thinking they did because they didn't know better. (This extends to the graphics and sound of them, as well.) Sadly, this has killed off a lot of the genres of game I liked as a kid, even 3D ones such as Petz and The Dog Island. You're either an F2P mobile title or an AAA title, and even then, certain genres are seen as suitable only for the former. The AAA companies have successfully tricked a lot of people into thinking that certain genres or even art styles "don't deserve" to be paid for because they are supposedly borne of incompetence or poverty. I wish adults could look at games with child-like eyes again, not minding that their imagination "filled in the blanks" for the laconic graphics and writing, because what the game DID provide was legitimately enjoyable on its own merits.

in reply to @johnnemann's post:

"Arriflex cameras / digital cameras / software video editing have totally levelled the playing field for moviemaking" 🙄

I find myself thinking of how Darren Aronofsky's first film still cost about $150,000, if I remember correctly - and it was only that cheap because he was doing location shooting illegally and cutting other corners like that. And I think Pi is great, but it's still very obviously a cheap movie, like a better-than-average student film. Hollywood polish is expensive stuff, and the same's true of AAA games, I'm sure.

~Chara