Layering 2D art into a 3D world with massive particle effects can be REAL hard to get right!
A big driver behind that (it's a MASSIVE change!) is to increase performance on lower-powered PCs and consoles, so we should get acceptable framerates on Switch and even on Intel integrated GPUs now (I'll need to dig up an Intel iGPU to test that on, though...)
Here's an example of some of the fun issues you can get into when you try to mix 2D sprites with 3D worlds. Before, we had SO many custom shaders dealing with depth and stencil writing, plus ordering hacks to get everything almost right. All of that is now handled with a few URP Renderer Features. Big fan! Read on below for more details.
So, how was it porting a big Unity project between pipelines? IT'S BEEN A MAJOR PAIN IN THE ASS TBH. The automatic conversion tools are NOT as good as I'd hoped. They can't do anything with custom shaders that aren't URP compatible, which, you know, is understandable... but they also fail on basic things like BIRP Standard Particle material ➡️ URP Standard Particle material. The converter made every single one emissive and screwed up some of the blend settings, so it's been a whack-a-mole situation where I've gotta keep my eyes peeled for every single VFX in the game and compare it to a reference build to make sure it's rendering the same. Demonschool has some 880-odd materials at the moment. That's a lot of materials to double-triple-check!
Lighting was the biggest issue. We NEEDED to port to URP for better lighting performance, but it also had the potential to tank the whole project: URP defaults to a more physically correct light attenuation model... which is fine, I guess, if you're making something with a photorealistic style? We are not. Basically, the expectation seems to be that if you convert from BIRP to URP, you get an artist to relight your whole game. That's a no-go for us.
What I ended up doing was forking URP and modifying a bunch of lighting code to make light attenuation match built-in as closely as I could (mostly in the RealtimeLights.hlsl library file). Lemme tell you, that sucked. Maths is hard, I'm not good at it, and it took way too long to figure out. There's a lot of forum posts and Reddit threads about this issue, and I went through SO many solutions until I ended up with something that mostly worked. I DO NOT RECOMMEND DOING THIS. I really think URP needs configurable light attenuation: some way to toggle between the older behaviour and the more modern, physically accurate, energy-correct solution. On a per-light basis, even! Not just for people porting from the old renderer, but for people who don't care about photorealism. It's a big hole right now and I REALLY think it needs to be plugged.
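For the curious, here's roughly the shape of the two falloff curves I was fighting. This is a hedged Python sketch, not shader code: the constants are my approximation of the commonly-cited built-in falloff and of URP's DistanceAttenuation, not lines copied from Unity's source.

```python
def birp_attenuation(d, r):
    """Rough shape of the Built-in RP point-light falloff: bounded near the
    light, fades quickly. (Approximation; BIRP also applies a cutoff texture
    so the light actually reaches zero at its range.)"""
    return 1.0 / (1.0 + 25.0 * (d * d) / (r * r))

def urp_attenuation(d, r):
    """Rough shape of URP's physically based falloff: inverse-square times a
    smooth window that brings it to zero at the light's range."""
    if d >= r:
        return 0.0
    dist_sqr = d * d
    window = max(0.0, 1.0 - (dist_sqr / (r * r)) ** 2) ** 2
    return window / dist_sqr  # blows up as d -> 0, unlike built-in

# Same light, same range, very different brightness at each distance:
for d in (0.2, 0.5, 0.9):
    print(d, round(birp_attenuation(d, 1.0), 3), round(urp_attenuation(d, 1.0), 3))
```

The punchline: with the same range, URP's curve is far hotter close to the light and dies off much harder, which is exactly why every scene suddenly looks relit when you port.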
Other than that, I'm really happy with lights in URP. The built-in render pipeline supports an UNLIMITED number of lights per object in its forward renderer... which sounds cool on paper until you realise it's a multipass solution: each additional light affecting an object triggers a whole extra draw of that object, with each light blended on top of the last. On a system like the Switch, this can KILL performance (again, the main reason for the renderer port). We use a lot of lights in Demonschool to hit the right spooky vibes, especially at night; we can sometimes have 4 point lights and one directional... sometimes even more than that... which, you know, doesn't sound bad! But when 90% of your onscreen geometry is getting drawn 5-6 times on a low-end GPU, it's not a great time. The classic Unity solution for this was to switch the built-in renderer from Forward to Deferred rendering - which decouples lighting cost from geometric complexity. Cool. Nice. It doesn't work with orthographic cameras in Unity... nice... That's a no-go. So yeah, that brings us to URP, which by default only supports single-pass lighting with its forward renderer, but supports WAY more lights in that single pass - enough for even our most complex lighting cases to be handled in one pass. This made the whole 3-week engineering cycle worth it by itself!
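To put numbers on that "drawn 5-6 times" thing, here's a back-of-the-envelope sketch in Python. The per-object cap of 8 additional lights is my understanding of URP's classic forward renderer (Forward+ raises it); treat it as an assumption, not gospel.

```python
def birp_forward_draws(num_objects, lights_per_object):
    """Built-in forward: one base-pass draw per object, plus one extra
    additive-pass draw for every additional per-pixel light touching it."""
    return num_objects * (1 + max(0, lights_per_object - 1))

def urp_forward_draws(num_objects, lights_per_object, per_object_cap=8):
    """URP forward: every light under the per-object cap is shaded inside
    the same single pass, so the geometry is only drawn once."""
    if lights_per_object > per_object_cap:
        raise ValueError("past the cap, lights get dropped rather than drawn")
    return num_objects

# Our rough worst case: ~5 lights (4 point + 1 directional) on most of the scene.
print(birp_forward_draws(100, 5))  # 500 draws
print(urp_forward_draws(100, 5))   # 100 draws
```

Same scene, 5x fewer geometry draws - that's the whole ballgame on a low-end GPU.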
So, everything we get on top of better lighting performance has been a nice bonus. My favourite by far is Renderer Features, a nice flexible customization hook that lets you throw new steps into the rendering pipeline easily. What this means in practice is we can have INCREDIBLY TIGHT control over render ordering in a way that's not really been possible in Unity before, and it's fantastic. That's what directly fixed all those tricky sorting issues we had in the past. For example, we use a lot of stencil buffer tricks to get character sprites and the floor to work correctly - setting all of that up was a breeze with URP Renderer Features. Before, we needed a lot of custom shader code and complicated hacks to get things working right - now we can inject a 'Render Objects' renderer feature right at the start of the pipeline that draws the floors first and handles all the stencil setup. Same deal with our character sprites - we can explicitly set when we want them rendered. If you're doing anything graphically complex (like Demonschool) in Unity, I really recommend checking this stuff out - it might save you some massive headaches!
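If it helps to picture the floor/sprite stencil trick, here's a toy software model of it. This is plain Python and nothing URP-specific - in the real project it's just a couple of 'Render Objects' features with their stencil write/compare settings configured - but it shows the core idea: explicit pass ordering, with an early pass marking the stencil and a later pass testing it.

```python
def render(passes, width, height):
    """Run draw passes in an explicit order, like renderer features let us do."""
    color = [[None] * width for _ in range(height)]
    stencil = [[0] * width for _ in range(height)]
    for draw in passes:
        draw(color, stencil)
    return color

def floor_pass(pixels, ref):
    """Early pass: draw the floor and write `ref` into the stencil buffer."""
    def draw(color, stencil):
        for x, y in pixels:
            color[y][x] = "floor"
            stencil[y][x] = ref
    return draw

def sprite_pass(pixels, ref):
    """Later pass: draw the sprite only where the stencil test (== ref) passes."""
    def draw(color, stencil):
        for x, y in pixels:
            if stencil[y][x] == ref:
                color[y][x] = "sprite"
    return draw

# Floor covers the left half of a 4x1 row; the sprite tries to draw across
# the whole row but only lands where the floor already marked the stencil.
frame = render(
    [floor_pass([(0, 0), (1, 0)], ref=1),
     sprite_pass([(0, 0), (1, 0), (2, 0), (3, 0)], ref=1)],
    width=4, height=1)
print(frame[0])  # ['sprite', 'sprite', None, None]
```

Swap the pass order and the trick falls apart - which is exactly why having a hook to pin down ordering matters so much.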

