MisutaaAsriel
@MisutaaAsriel

Anyone familiar with Blender know if it is possible to convert "Object" or "Generated" texture coordinates into a UV Map?

I am making something with procedural geometry in geometry nodes which uses a high-resolution tiled texture, for Unity. VRChat specifically. It's not something that can reasonably be baked, due to its scale and the fact that it's intended to render on the Oculus Quest 2, so texture memory will be limited…

But… I'm coming up with bupkis on results for this stuff. UV unwrapping in geometry nodes just… doesn't work for this, and I'm unable to find any solutions to my problem. The only Stack Exchange post on this question resulted in "bake the texture to the model" (which isn't an answer; it's a workaround), but as stated, I have very good reason not to (this material, as is, uses high-resolution photogrammetry!).

When using a shader node setup with Object or Generated texture coordinates, it almost wraps perfectly. There are only a couple of areas, here and there, where the wrapping would need adjusting. But you can't use Blender shader nodes in Unity.

I need help.


To clarify:

The object being generated in geometry nodes is agnostic to its base geometry; that is, I am taking an "input" geometry and passing it through a series of node functions that manipulate it.

The problem with this is that I do not understand how procedural unwrapping works, and I am struggling to figure it out. "Normal" methods of unwrapping procedural geometry, where it's a simple rock or something similar, do not work here. I am applying this procedural geometry to an entire structure: a volume with dimensions and curves and corners, etc.

And because this is designed for a Unity-based project, UV mapping is required, so I cannot use the Object texture coordinate space nor the "Generated" texture coordinate space. I don't even understand what Object texture coordinates are, just that they work.



in reply to @MisutaaAsriel's post:

Object texture coordinates are three-dimensional, but UV maps only have two dimensions, so there isn't an easy, direct way to do it.

But you say you're using a high-resolution tiled texture. Do you have this set to Box projection, so that what you get is something like this?
[Image: Object texture coordinates]

If so, your best bet is to do a Cube Projection unwrap and scale the UVs up until you get similar repeating:
[Image: Box projected UVs]
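For anyone following along, a box (cube) projection basically picks, per face, the axis the normal points along most and uses the other two object-space components as the UV, which is why a Cube Projection unwrap plus scaled-up UVs ends up looking so similar. A tiny Python sketch of that idea, with a made-up function name and sample values:

def box_project_uv(position, normal, scale=1.0):
    # Rough sketch of a box projection: drop the axis the face normal
    # mostly points along, keep the other two object-space components.
    x, y, z = position
    ax, ay, az = abs(normal[0]), abs(normal[1]), abs(normal[2])
    if ax >= ay and ax >= az:        # face points mostly along X
        u, v = y, z
    elif ay >= az:                   # face points mostly along Y
        u, v = x, z
    else:                            # face points mostly along Z
        u, v = x, y
    # Scaling the result up is the equivalent of scaling the unwrapped
    # UV islands way up so the tiled texture repeats densely.
    return (u * scale, v * scale)

print(box_project_uv((0.25, 1.5, 0.75), (0.0, 0.0, 1.0), scale=4.0))

(Blender's actual Box mapping can also blend between the three projections near edges, which is why it can look seamless where a hard cube unwrap would show seams.)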

"Do you have this set to Box projection, so that what you get is something like this?"

I do not. In the shader nodes I am going directly from the Texture Coordinate node's "Object" output into the texture's "Vector" input. There is no Mapping node.

"Object texture coordinates are three-dimensional, but UV maps only have two dimensions, so there isn't an easy, direct way to do it."

Yes, but Blender has to be converting it "under the hood" anyway to map the texture to the object, so there's got to be a way to do it.

I'm just left scratching my head, because I genuinely do not understand what's going on with the "Object" texture coordinates. It's not perfect (some triangles have stretched UVs), but using Object coordinates in the shader nodes is almost right. If I could somehow convert that to a UV map and then edit the UVs as needed, it'd work.
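For what it's worth, once the geometry nodes modifier is applied (so the generated mesh is real mesh data), that "under the hood" conversion can be approximated with a small bpy script that writes each face corner's object-space position into a UV layer, using the same pick-two-axes idea as a box projection. This is only a sketch of the idea, not whatever Blender does internally; the layer name and scale factor are made up:

import bpy

obj = bpy.context.active_object   # the mesh, with the geometry nodes modifier applied
mesh = obj.data

# Create (or reuse) a UV layer to bake into; the name is arbitrary.
uv_layer = mesh.uv_layers.get("ObjectCoordUV") or mesh.uv_layers.new(name="ObjectCoordUV")

scale = 4.0  # tweak until the tiling density matches the shader preview

for poly in mesh.polygons:
    nx, ny, nz = abs(poly.normal.x), abs(poly.normal.y), abs(poly.normal.z)
    for loop_index in poly.loop_indices:
        # Object-space position of this corner's vertex.
        co = mesh.vertices[mesh.loops[loop_index].vertex_index].co
        if nx >= ny and nx >= nz:
            u, v = co.y, co.z
        elif ny >= nz:
            u, v = co.x, co.z
        else:
            u, v = co.x, co.y
        uv_layer.data[loop_index].uv = (u * scale, v * scale)

The stretched triangles would still be stretched, but since this produces an ordinary UV map you can fix those by hand in the UV editor afterwards, and the map carries over to Unity on export.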

What I did was a quick and dirty recreation of what Object coordinates do. I can't be sure why that wouldn't work without seeing more of your setup, but this is a more complete recreation of the effect:
[Image: More complete Object coordinate nodes]
If you're not using a target object, you'd just hook up the Self Object node to the second Object Info node.
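I can't see the node screenshot here, but if the goal is recreating what Object texture coordinates do with a target object, the underlying math is presumably: take the local position, go to world space via the object's own transform, then into the target object's local space via its inverted transform. A small sketch with made-up transforms, using Blender's mathutils:

from mathutils import Matrix, Vector

# Stand-ins for what the two Object Info nodes would provide; the real
# matrices would come from the evaluated object and the target object.
self_matrix = Matrix.Translation((2.0, 0.0, 0.0))
target_matrix = Matrix.Rotation(0.5, 4, 'Z')

def object_coords(local_position, self_matrix, target_matrix):
    world = self_matrix @ Vector(local_position)   # local -> world
    return target_matrix.inverted() @ world        # world -> target's local space

print(object_coords((1.0, 1.0, 0.0), self_matrix, target_matrix))

With no separate target (the Self Object case), target_matrix equals self_matrix and the whole chain collapses back to the plain local position, which matches hooking the Self Object node into the second Object Info node.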