Does anyone familiar with Blender know if it is possible to convert "Object" or "Generated" texture coordinates into a UV map?
I am making something with procedural geometry in geometry nodes that uses a high-resolution tiled texture, for Unity (VRChat specifically). It's not something that can reasonably be baked, both because of the scale involved and because it's intended to render on the Oculus Quest 2, where texture memory is limited…
But… I'm coming up with bupkis on results for this. UV unwrapping in geometry nodes just… doesn't work for this, and I'm unable to find any solutions to my problem. The only Stack Exchange post on this question resulted in "bake the texture to the model" (which isn't an answer; it sidesteps the question), but as stated, I have very good reason not to: this material, as it stands, uses high-resolution photogrammetry!
When using a node shader set to Object or Generated coordinates, it almost wraps perfectly; there are only a couple of areas, here and there, where the wrapping would need adjusting. But you can't use Blender shader nodes in Unity.
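To show what I mean: as far as I understand it, Generated coordinates are just the position normalized to the object's bounding box, so the literal conversion into a UV layer would look something like the Python sketch below. Everything in it is my own guess (the layer name, the choice of axes), it only touches the base mesh rather than the geometry-nodes output, and dropping an axis is exactly where it falls apart on anything that isn't flat:

```python
import bpy

obj = bpy.context.active_object
mesh = obj.data  # base mesh only, not the evaluated geometry-nodes result

# Bounding box of the mesh in local (object) space.
xs, ys, zs = zip(*((v.co.x, v.co.y, v.co.z) for v in mesh.vertices))
mins = (min(xs), min(ys), min(zs))
size = (max(xs) - mins[0] or 1.0,
        max(ys) - mins[1] or 1.0,
        max(zs) - mins[2] or 1.0)

# Write a "Generated"-style projection into a new UV layer.
uv_layer = mesh.uv_layers.new(name="FromGenerated")  # name is arbitrary
for loop in mesh.loops:
    co = mesh.vertices[loop.vertex_index].co
    # Keeping X/Y and throwing away Z is an arbitrary choice; picking the
    # right two axes per face is the part I can't do for a whole structure.
    uv_layer.data[loop.index].uv = ((co.x - mins[0]) / size[0],
                                    (co.y - mins[1]) / size[1])
```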
I need help.
To clarify:
The object being generated in geometry nodes is agnostic to its base geometry. That is, I take an "input" geometry and pass it through a series of node functions that manipulate it.
The problem is that I do not understand how procedural unwrapping works, and I am struggling to figure it out. The "normal" methods of unwrapping procedural geometry, where it's a simple rock or something similar, do not work here: I am applying this procedural geometry to an entire structure, a volume with dimensions and curves and corners and so on.
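In case it matters, this is what I mean by the "normal" method: geometry nodes can emit a UV map by storing a 2D vector attribute named "UVMap" on the face-corner domain. A rough Python sketch of adding that node is below; the group name, data type, and socket names are from my setup and may differ between Blender versions:

```python
import bpy

# "Geometry Nodes" is just the default group name on my modifier.
group = bpy.data.node_groups["Geometry Nodes"]

store = group.nodes.new("GeometryNodeStoreNamedAttribute")
store.data_type = "FLOAT2"   # 2D vector, i.e. a UV map
store.domain = "CORNER"      # UVs live on face corners
store.inputs["Name"].default_value = "UVMap"

# What actually gets wired into its Value socket is the whole problem:
# for a simple rock you can project Position, but on a structure with
# corners and curves that projection stretches badly.
```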
And because this is designed for a Unity-based project, UV mapping is required, so I cannot use the Object texture coordinate space or the "Generated" texture coordinate space. (I don't even understand what Object texture coordinates are, just that they work.)
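For reference, my rough mental model of the two spaces (which may well be wrong, hence the question) is just this:

```python
def object_and_generated(p, bbox_min, bbox_max):
    """My rough mental model, which may be off:
    - Object coords: the point's position in the object's local space (3D).
    - Generated coords: the same position remapped so the bounding box
      spans 0..1 on each axis (also 3D).
    Both are three components, which is why neither drops straight into
    Unity's 2D UV channels."""
    object_coords = tuple(p)
    generated_coords = tuple(
        (p[i] - bbox_min[i]) / (bbox_max[i] - bbox_min[i]) for i in range(3)
    )
    return object_coords, generated_coords
```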