(this is a rant I had on Mastodon, slightly cleaned up, but still a bit scatterbrained)
Thinking about when people talk about game engines making things more "even" between 3-person indie groups and AAA titles. But it's kinda bullshit. And also irrelevant. Lemme splain.
there are a few different non-mutually-exclusive notions of how to quantify a game's value in the mix here. (these are quantitative lenses, offered with full acknowledgment that an artistic medium isn't ultimately made of numbers like this.)
content quantity - how many unique authored pixels/texels, frames of animation, units of accessible world space, seconds of cinematics and recorded audio, goals, interactive permutations, etc are there on the (literal or figurative) disc? how do the systems serve to condense and/or extend this content? games that lean hard on procedural generation can obviously multiply this, but even something as small as random monster spawns or loot tables has an effect along this axis. a good engine can ease many of the developer burdens that limit how much content devs can ship in a game - the same team in parallel universes A and B trying to make the same game with two different engines X and Y, where X is better at, let's call it "content wrangling", than Y, will see the game from universe A ship with more content packed in.
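to make that multiplier concrete with a tiny hypothetical sketch (the item names, modifiers, and weights below are all invented for illustration): a weighted loot table turns a handful of authored strings into many distinct drops the player can encounter, which is the "condense and/or extend" move in miniature.

```python
import random

# hypothetical authored content: 3 items + 4 modifiers = 7 strings
ITEMS = ["sword", "axe", "wand"]
MODIFIERS = ["rusty", "sturdy", "flaming", "cursed"]
WEIGHTS = [5, 3, 1, 1]  # rarer modifiers roll less often

def roll_loot(rng: random.Random) -> str:
    """Combine a weighted modifier with a uniformly chosen item."""
    modifier = rng.choices(MODIFIERS, weights=WEIGHTS, k=1)[0]
    item = rng.choice(ITEMS)
    return f"{modifier} {item}"

# 7 authored strings yield 3 x 4 = 12 possible distinct drops
rng = random.Random(1234)
print(roll_loot(rng))
```

the point being: the authored content grows linearly (add one modifier) while the space the player experiences grows multiplicatively, which is why even this trivial system nudges the quantity axis.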
content fidelity - how much unique authored raw information makes it onto the final rendered frame / audio buffer at its output resolution + framerate? this was the arms race that games became obsessed with around the turn of the 1990s and rode up until around a decade ago, after which noticeably diminishing returns kicked in. (this is not to say that high end games aren't still pushing fidelity in 2023 and won't continue for the foreseeable future, going for eg 4K everything with no upsampling, even supersampling from massive textures and 8-digit-polycount source meshes, etc. but i think it's safe to say that compared to 20-25 years ago, pure fidelity is no longer the force driving the economics and processes of game development.) by most definitions, a good game engine will be able to throw more pixels and polygons at the screen, and the user's perception of the content's fidelity will be higher. pretty straightforward here.
content quality - the devil of subjectivity enters: how nice does any given bit of content / output look, sound, play? obviously a skilled artist and a novice artist will be able to achieve very noticeably different results with the exact same target resolution. quake 1's textures look gorgeous compared to those of most of its contemporaries. this is a complicated argument that i should probably write an entire book about someday, but good tools allow devs to arrive at higher quality results, more quickly, which allows them to not only do a better job with each piece of content but also to try more things. it's not just about the commonly cited "lowering iteration times" - better tools can fundamentally transform your understanding of what you're making, allow you to "see further", make better decisions and become a wiser, more powerful creator. amazing quality results can and have been achieved even with godawful tools - but hop in your time machine and give those same folks better tools, and you'll get a better game.
labor intensivity - how many person-hours of work went into the game? how many people are on the project? how many hours do they work? what is their quality of life? how efficient do the tools allow them to be? if it sounds weird to speak of these as a major metric of game quality, ask a game dev how much worse a job they do after 20 hours awake + 7-day weeks vs fully rested and unstressed. a good engine can help ensure that each moment devs spend working on a game results in a corresponding quality improvement. and on the whole, game engines and tools have inarguably gotten better at this in the last 20-30 years. we love retro games but nobody still wants to build anything in 3D Studio MAX r1 (1996).
capital intensivity - how much money was spent on the labor and content that resulted in the finished product? this is the most complex equation of all, but it's at the dark heart of the entire games industry (which, it must be said in 100ft flaming letters, is not the same thing as the creative medium of games). when you look at an E3 briefing, you are witnessing a capital expenditure that has been optimized to maximize return on investment.
because capital sits upstream from labor in the process, everything that's true of the relationship between game engines and labor-intensivity is also true of capital-intensivity, with the addition of being able to brag that your platform or game engine is more powerful than your competitors'. and lo, the battlefields of the console wars will run red with slurs, as ever they have.
so obviously, to the OP's point, big budget teams have advantages in almost all of these areas. compared to a typical indie team they can create more content, at a higher fidelity, and throw way more money and thus people at a production. of course, they also pay higher coordination costs, with ubisoft's multinational 1000s-strong projects posing massive logistical and production challenges at the extreme end of the spectrum. broadly, big teams have less agility, less ability to experiment, and can take far fewer risks. 10-15 years ago folks working in what became the "indie games industry" were able to find new niches and revive old ones, and the costs of doing so weren't the deterrent they are for a big corporation. this dynamic has been fundamental to cultural production for as long as the concept has existed, and will continue indefinitely.
what i think has changed dramatically in more recent eras is the ways in which tools (including engines) allow smaller teams to "punch above their weight", producing higher quantities of higher quality content with less labor (and thus less capital)[1]. as an example, i'd argue that id software's rapid rise in the early 90s was due in large part to its commitment to tools - the codebase they shipped Commander Keen 1: Marooned on Mars with allowed them to make tons of other games quickly, increase their skills, try different genres, and get more creatively ambitious. they carried this through to Wolf3D and Doom, using cutting edge NeXT workstations to quickly design 3D levels when most devs were still plodding along making things in 2D tile editors. this kicked the already hot tech and tools races of the 90s into a higher gear, and ultimately once big publishers learned how to harness this dynamic we got modern industrial-scale game production in the 00s, in all its grandeur and horror.
but to be clear i agree with the OP's broad assertions: there is no real "leveling the playing field" tool or tech or process that will let a team of 5 people produce the same work as a team of 50 or 100 people. even if every dev had access to every tool, small productions have too many inherent differences from large ones. i don't think that invalidates any of the amazing things that better tools can do, and i think general awareness of this is only growing with time, as AAA scrapes the stratospheres of capital-intensivity (often only sustainable with massive consolidation! me reaping: hell yes, etc) and the PS5 gen's hardware architecture resembles a planet-dismantling mass driver for putting money on screen, and more and more of the audience deprograms and steps off the 90s-00s train of thought that says that those pure quantitatives - "more content = better game, more fidelity/money = better game" - determine what they'll find compelling. instead, we'll hopefully continue to have a wider range of folks, making a wider range of stuff, for a wider range of folks, and having a better time doing it.
[1] it's also true that lots of design trends of the past decade, from permadeath to procedural worlds to user-created content, have been about wringing a better quality / more valuable game out of the same amount of content - but that's more in the realm of design and creative constraints than tools or technology.
