#tech art

Short update: I've been working on cleaning some things up and solving some issues I left for later, in order to scale up to a fully functional pipeline and make the ~50ish islands I'm projecting I'll need for the demo. One of those things is that the stream/waterfall generation system I was using only worked for islands that had a hole in the middle topologically (Doughnut Islands). This is because I was generating the streams by connecting points scattered on the inner and outer edges of the island, which guarantees the streams always cross the island from edge to edge.

[Diagram showing edge connection]

The problem is, when there are no holes, how do I decide which points to connect in a similar way? I basically just want to avoid the case where the two ends of the river end up on the "same side" of the island, instead having them go across lengthwise, like this:

[Diagram showing desired edge connection]

Unfortunately there isn't a unique way to say which "side" is which on something that's topologically a circle. The solution I used is to cluster the points into two clusters based on their distance around the edge of the island, so that points closer in edge distance are grouped together.

[Houdini cluster node]

This doesn't work perfectly in every situation, but it works well enough! I'd like to more or less redo the stream generation in the future to do something a bit less ad-hoc anyway, so it's Good Enough For Now.
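To sketch the grouping idea outside of Houdini: if each boundary point is parameterized by arc length around the edge, one simple way to get two contiguous groups is to cut the loop at its two widest gaps. This is just an illustrative stand-in in plain Python, not what the Houdini cluster node actually does internally, and the function name is my own:

```python
def split_edge_points(ts):
    """Split boundary points (arc-length params in [0, 1) around the edge
    loop) into two contiguous groups by cutting at the two largest gaps.
    Returns two lists of indices into ts."""
    n = len(ts)
    order = sorted(range(n), key=lambda i: ts[i])
    # circular gap after each point in sorted order
    gaps = []
    for k in range(n):
        a = ts[order[k]]
        b = ts[order[(k + 1) % n]]
        gaps.append(((b - a) % 1.0, k))
    # cut the loop at the two widest gaps
    (_, i), (_, j) = sorted(gaps, reverse=True)[:2]
    lo, hi = sorted((i, j))
    side_a = [order[k] for k in range(lo + 1, hi + 1)]
    side_b = [order[k] for k in range(hi + 1, n)] + [order[k] for k in range(lo + 1)]
    return side_a, side_b
```

With two groups in hand, connecting a point from one side to a point on the other gives a stream that crosses the island lengthwise rather than doubling back on itself.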

[GIF: a floating, sparkling island surrounded by clouds]

Gonna start posting some devlogs and stuff about my video game On Here, hope someone finds it interesting and vibes with my personal fixations

Firstly, what is the game:

You play as a member of the Crystal Scouts--organized by the world's universities to verify the existence of the last undiscovered Crystal Classes in order to prove a theory which predicts the world's end.

If you want a Genre-Centric Description it's a Small Open World (Science?) Fantasy First Person Exploration Game I Guess

The presented gif shows progress on some procedurally generated floating islands I'm working on for the game, and I want to rant a bit about my workflow and art direction

The island's geo is made 100% procedurally in houdini, and the shaders don't use any hand-made textures; they're driven mostly by procedural noise and vertex data from the meshes. rn the clouds in the scene are just arranged by hand to have some background there, but in the future they'll be generated along with the full level layout

I'll put a couple more technical details at the end, but the main thing I want to talk about is why I decided to use a fully procedural environment art workflow in the first place

There are two main reasons:

  1. I found that it was very hard to get a feel for whether the level design of the game works, or even whether the game works as an experience at all, in a greyblock. I just couldn't answer many questions without high quality art, because it's essential to the experience of the game. This is a similar problem to the one they experienced prototyping Firewatch (43m), which is what inspired me to try this full-vertical-slice prototyping workflow. Going with a procedural art workflow is a big part of the solution to this problem, as it lets me as a single person iterate quickly on the level design and create high quality art for it in a fraction of the time it would take by hand

  2. Sticking to procgen is a self-imposed "limitation" stemming from my art direction and world design. The world has an elaborate fictional geology, and I want the environment to directly reflect that as much as possible. Meaning, whenever possible I want the environment to be generated (or at least rationalized) by the logical processes of the world, and using a fully procedural workflow both encourages and enables that

#2 isn't to say that there isn't any human touch to the art direction, of course--there's still a lot of it. One of the things I considered originally was trying to be extremely high concept and generate the world entirely from its first principles of physics. If I had an infinite amount of time maybe I'd try this, but I decided that isn't really what I wanted (and it's, a little bit, too ambitious). Truthfully I don't really like programming that much, I have specific images and feelings in my head that I want to make appear, and to an extent I would much rather play with shapes and colors with my hands (mouse).

But I still wanted to thoroughly incorporate my ideas for the world's physics into the environment design, because it's pretty important to the game design and world building, and also the concept is just very much my jam. So the compromise I ended up with is that the high level direction for the environment always needs to come from some heuristic interpretation of the first principles of the world's geology. Since I'm not generating it precisely from first principles, the creative direction comes from choosing a high level interpretation of those principles that creates something in line with what I like aesthetically, but is still always guided by a certain consistent logic. The environment concept I have for the prototype, based on these floating islands, is one such interpretation. Working in a completely procedural workflow is perfect for this, because at the end of the day there is always some logical process by which the environment is generated that I can point to

Apologies for being fairly abstract here but idk how much I want to talk about specifically what these "geologies" and "processes" and "laws of physics" are since finding that stuff out is part of the game I think :>

So my next step is iterating on and polishing these islands a bit more, and then moving on to generating the rest of the environment (clouds mostly, lots of clouds)

Finally a couple more promised technical details for those inclined--

The base input to the algorithm presently is a procedural noise which is used to generate the base island shapes (I'll most likely change how this is done in the future once the level design is more solidified). The generation is run offline because it takes ~5min per island (can def be improved with some optimization) and the game design doesn't necessitate real-time generation (which would also require a completely different workflow)--the meshes are just written out as fbx from houdini and loaded into unreal. The only output from houdini is the meshes: I'm avoiding textures as much as possible and relying only on vertex data and cheap procedural noise for shading. There are a few reasons for this, but the one relevant here is that it makes one thing much easier about my workflow: I don't have to worry about generating UVs procedurally (which is annoying to do right). In the future I'll probably make some more involved technical posts about the specific techniques I used in houdini and unreal to make this
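For a sense of what "cheap procedural noise" means here: noise along the lines of classic 2D value noise, where a hash of the integer lattice is smoothly interpolated. This is a generic sketch in Python for illustration only--the actual shader code is different, and the hash constants are arbitrary:

```python
import math

def hash01(ix, iy):
    # cheap integer hash -> value in [0, 1); constants are arbitrary
    h = (ix * 374761393 + iy * 668265263) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 0xFFFFFFFF

def value_noise(x, y):
    """2D value noise: hash the four surrounding lattice corners and
    blend them with a smoothstep fade."""
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    ux = fx * fx * (3 - 2 * fx)  # smoothstep fade curves
    uy = fy * fy * (3 - 2 * fy)
    a = hash01(ix, iy)
    b = hash01(ix + 1, iy)
    c = hash01(ix, iy + 1)
    d = hash01(ix + 1, iy + 1)
    return (a * (1 - ux) + b * ux) * (1 - uy) + (c * (1 - ux) + d * ux) * uy
```

The appeal for shading is that it needs no texture fetches and no UVs--it can be evaluated directly from world or object space position.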

That's it thanks for reading :host-plead:

hecccc yessss it's taken all weekend - pls don't ask me how much sleep i've had - but hypehypehype we got a working first draft of our custom 2D lighting model in the game at last!

I'm super fucking chuffed with this, it's taken a whole lot of tinkering (and there's plenty more left to do!) but it's starting to look like the target renders our artist @cdpnk put together all those moons ago.

Continuing on my saga of dealing with giant PSD files, yesterday I went into a state of hyperfocus and uhhh... so now I have a lil C# app I can use for reading the giant PSDs, and automatically slicing them into tiles, Aaaaand it merges all the alpha channels from the shadow/highlight/blackout layers into the RGBA channels of one matching PNG 🤯
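The merge step boils down to interleaving those per-layer masks into one RGBA buffer. A minimal sketch of the idea in Python (the real tool is a C# app; the channel assignment R=shadow, G=highlight, B=blackout and the base-alpha fallback are my assumptions, not how the actual app lays things out):

```python
def pack_rgba(shadow, highlight, blackout, base_alpha=None):
    """Pack three single-channel masks (one byte per pixel) into one
    interleaved RGBA byte buffer, ready to write out as a PNG."""
    n = len(shadow)
    assert len(highlight) == n and len(blackout) == n
    out = bytearray(n * 4)
    for i in range(n):
        out[4 * i + 0] = shadow[i]     # R <- shadow layer alpha
        out[4 * i + 1] = highlight[i]  # G <- highlight layer alpha
        out[4 * i + 2] = blackout[i]   # B <- blackout layer alpha
        # A <- base sprite alpha if given, else fully opaque
        out[4 * i + 3] = 255 if base_alpha is None else base_alpha[i]
    return bytes(out)
```

Packing the masks this way means the lighting shader can sample one texture and pull all three masks out of a single fetch.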

Was fun to learn how to make a WPF app... and how to read a PSD file.

It still looks like garbage, and there's a lot of room for perf enhancements, but it doesn't devour all of my ram every time I run it, and it takes about 2 minutes to slice a big ol' file, so I'ma call that a win.