MisutaaAsriel
@MisutaaAsriel

Anyone familiar with Blender know if it is possible to convert "Object" or "Generated" texture coordinates into a UV Map?

I am making something with procedural geometry in nodes which uses a high-resolution tiled texture, for Unity (VRChat specifically). It's not something that can be reasonably baked, due to scale and the fact that this is intended to render on the Oculus Quest 2, so texture memory will be limited…

But… I'm coming up with bupkis on results for this stuff. UV unwrapping in geometry nodes just… doesn't work for this, and I'm unable to find any solutions to my problem. The only Stack Exchange post on this question resulted in "Bake the texture to the model" (which isn't an answer; it sidesteps the question), but as stated, I have very good reason not to (this material, as is, uses high-resolution photogrammetry!).

When using a node shader set to Object or Generated coordinates, it almost wraps perfectly. There are only a couple of areas here and there where the wrapping would need adjusting. But you can't use Blender shader nodes in Unity.

I need help.


To clarify:

The object being generated in geometry nodes is agnostic to its base geometry. That is, I am taking an "input" geometry and passing it, in nodes, through a series of functions that manipulate it.

The problem with this is that I do not understand how procedural unwrapping works, and I am struggling to figure it out. "Normal" methods of unwrapping procedural geometry, where it's a simple rock or something else, do not work. I am applying this procedural geometry to an entire structure — a volume with dimensions and curves and corners, etc.

And because this is designed for a Unity-based project, UV mapping is required, so I cannot use the Object texture coordinate space or the "Generated" texture coordinate space. I don't even understand what Object texture coordinates are, just that they work.


MisutaaAsriel
@MisutaaAsriel
  • Generated texture space? Looks beautiful! Exactly what I want! Perfection.
  • UV Mapped? Only horizontal surfaces (e.g. the floor) look remotely right. Everything else is a hot mess.
    • Inspecting the UV it looks like it was just projected from view, top down. :u
    • UV is unwrapped using a method that was recommended to me in geometry nodes, but it clearly ain't right.

I need this to be represented the same way. And I need to use a repeatable texture as it is done in Blender, not bake the texture into a new file that is ridiculously large.

But How???

Like, surely there is a way to take the way Blender is representing the texture on the faces, and convert that into a UV map itself???



in reply to @MisutaaAsriel's post:

Object texture coordinates are three dimensional, but UV maps only have two, so there isn't an easy direct way to do it.

But you say you're using a high-resolution tiled texture; do you have this set to Box projection, so that what you get is something like this?
Object texture coordinates

If so, your closest bet is to do a Cube Projection unwrap and scale the UVs way up until you get a similar repeat:
Box projected UVs
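
If a script version is easier to poke at than the operator, here's a rough sketch of that suggestion; the scale factor is a made-up number you'd tune by eye until the repeat matches what the Object/Generated preview shows:

```python
# Rough sketch: Cube Projection unwrap, then scale the UVs up so the tiled
# texture repeats densely. Run with the target mesh as the active object.
import bpy

obj = bpy.context.active_object

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.cube_project(cube_size=1.0)  # the Cube Projection unwrap
bpy.ops.object.mode_set(mode='OBJECT')

# Scale the UVs "real big" so the texture tiles; 8.0 is a placeholder factor.
for loop_uv in obj.data.uv_layers.active.data:
    loop_uv.uv *= 8.0
```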

Do you have this set to Box projection so that what you get is something like this?

I do not. In the shader nodes I am going directly from the "Object" out-node to the "Vector" in-node. There is no "mapping" node.

Object texture coordinates are three dimensional, but UV maps only have two, so there isn't an easy direct way to do it.

Yes, but Blender has to be converting it "under the hood" anyway to map the texture to the object. So there's got to be a way to do it.

I'm just left scratching my head, because I genuinely do not understand what's going on with the "Object" texture coordinates. It's not perfect; some triangles have stretched UVs, but using Object coordinates in the shader nodes is almost perfect. If I could somehow convert that to a map and then edit the UV as needed, it'd work.

What I did was a quick and dirty recreation of what object coordinates do. I can't be sure why that wouldn't work without seeing more of your setup, but this is a more complete recreation of the effect:
More Complete Object Coordinate Nodes
If you're not using a target object, you'd just hook the Self Object node up to the second Object Info node.
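
For reference, the same "quick and dirty" idea as a script sketch: Object coordinates are essentially the vertex position in the object's local space, so writing two of those three axes straight into a UV layer gives a comparable flat projection (and the same stretching on walls). The layer name here is just a placeholder:

```python
# Sketch only: write two axes of the local-space position into a UV layer.
# This mimics a flat projection of Object coordinates, not the full box mapping.
# Run in Object Mode with the target mesh active.
import bpy

obj = bpy.context.active_object
mesh = obj.data
uv_layer = mesh.uv_layers.get("FromObjectCoords") or mesh.uv_layers.new(name="FromObjectCoords")

for loop in mesh.loops:
    co = mesh.vertices[loop.vertex_index].co  # vertex position in object space
    uv_layer.data[loop.index].uv = (co.x, co.y)  # XY only: fine on floors, stretched on walls
```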

in reply to @MisutaaAsriel's post:

Hi! I'm sorry I never got back to you, life has been rough, but I think I know what went wrong.

I'd like to see your shader nodes, because I'm guessing you have Box projection set in your Image Texture node; the behavior you've been describing seems kind of weird to me, and that would explain it. If that's the case, there's no perfect way to bake that to a UV map, since the shader can do the projection calculation per pixel, whereas with geometry-node UVs we're stuck per vertex. But you can still get close.

Blender Node Tree

Here's the whole node setup; I probably could have organized it better, but oh well.

Blender Node Tree

Generated coordinates are just using the position of the vertex; what box projection would be doing is picking the best pair of coordinates to sample the texture with, based on which face of a unit cube the normal points to.
This part here splits off those planes, and I've included this picture to get a better look at how they're plugged into the switch nodes.

Blender Node Tree

This is to find which of our coordinate planes (separated above) the vertex normal points to most.
Then the output of the switch is fed into the store attribute node, like before.

Some downsides: as stated before, there are some precision issues where the normal points ambiguously toward more than one plane, particularly at corners (though that shouldn't be a huge problem unless the corners are drastically rounded).

It also favors the top plane in those corner cases, because of the order of comparison here.

(There's probably an improved way to do this, so I'll keep hammering at it, but this should be a lot better than what I sent last time.)
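
In script form the same box-projection idea looks something like this: pick the axis the face normal points along most, then use the other two local-position components as the UV, with the same top-plane bias in ties. Names and the tiling factor are placeholders:

```python
# Sketch of per-face box projection: choose the dominant axis of the face normal,
# then use the other two local-space position components as the UV.
import bpy

obj = bpy.context.active_object
mesh = obj.data
uv_layer = mesh.uv_layers.get("BoxProjected") or mesh.uv_layers.new(name="BoxProjected")
tiling = 1.0  # placeholder scale factor

for poly in mesh.polygons:
    ax, ay, az = abs(poly.normal.x), abs(poly.normal.y), abs(poly.normal.z)
    for loop_index in poly.loop_indices:
        co = mesh.vertices[mesh.loops[loop_index].vertex_index].co
        if az >= ax and az >= ay:   # top/bottom faces: project onto XY
            u, v = co.x, co.y       # (the >= here is the "favors the top plane" tie-break)
        elif ay >= ax:              # front/back faces: project onto XZ
            u, v = co.x, co.z
        else:                       # left/right faces: project onto YZ
            u, v = co.y, co.z
        uv_layer.data[loop_index].uv = (u * tiling, v * tiling)
```

Because the plane is chosen once per face here, the choice can't change partway across a triangle; that limitation comes up again further down.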

No dice.

Poorly mapped textures on a cave wall

It didn't work… It's honestly worse than some of the other attempts, though in some ways it is kind of better? It's not just consistent vertical or horizontal lines, anyway…

The Generated Shader Nodes

Shader Node Tree

It's not doing much at all; it's literally just using "Generated" as the vector input to the textures. The multiplier is to increase the tiling level, similar to Unity's (so I can play with it in Blender).

When using the UV-mapped shader, it's literally just doing the same thing, but with the UV Map node instead of the object's Generated texture coordinates…
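
For reference, the setup described above boils down to roughly this in script form; the material name and texture path are hypothetical, and swapping the "Generated" output for a UV Map node gives the UV-mapped variant:

```python
# Sketch of the minimal preview shader: Generated coordinates -> multiply (tiling) -> image texture.
import bpy

mat = bpy.data.materials.new("CavePreview")  # hypothetical material name
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

coords = nodes.new("ShaderNodeTexCoord")
scale = nodes.new("ShaderNodeVectorMath")
scale.operation = 'MULTIPLY'
scale.inputs[1].default_value = (8.0, 8.0, 8.0)  # tiling multiplier, like Unity's material tiling
tex = nodes.new("ShaderNodeTexImage")
# tex.image = bpy.data.images.load("//rock_tile.png")  # placeholder texture path

links.new(coords.outputs["Generated"], scale.inputs[0])
links.new(scale.outputs["Vector"], tex.inputs["Vector"])
links.new(tex.outputs["Color"], nodes["Principled BSDF"].inputs["Base Color"])
```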

Weird thing I noticed while throwing things at the wall: when I add a "cross product" Vector Math node into the UV-mapped shader's nodes, with the second vector somewhere around (0.8, 0.4, 0.8), it looks significantly better. It's still significantly wrong, but it is better…

UV Mapped shaded cave

Shader Node Tree

The primary issue is that the mapping is more noticeably tiled, and inconsistent based on the facing direction (the back wall is more noticeably stretched, as are some other walls; you can see this a little looking into the passageway in the screenshot).

Also, note that this doesn't work if I try this in the geometry nodes, only the shader nodes.


If that's the case, there's no perfect way to bake that to a UV map, since the shader can do the projection calculation per pixel, whereas with geometry-node UVs we're stuck per vertex. But you can still get close.

See, I don't get this. At the end of the day it's still displaying the texture, in some fashion, on the faces, which means that mapping is possible. It's still showing portions of the texture on triangles, even if how it initially got to that point was without a UV map. Why can't I then "bake" that representation, that mapping of each triangle to the texture, into a UV map? I don't care about a strange UV layout or overlapping UVs in UV0, because UV1 exists for exactly this reason: a separate UV for lightmapping. It's not like the mapping of the texture changes with the camera angle either; it's static.

Blender Node Tree

The immediate problem is that you're still using Box projection.
Unity has no out-of-the-box way to represent this setting; since you're setting up these materials to see how they'd look in Unity, you need to be using Flat.
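
If it's handy, a quick script sketch to flip the Image Texture nodes from Box to Flat in one go; the material name is a placeholder:

```python
# Sketch: switch every Image Texture node in one material from Box to Flat projection,
# so the Blender preview only uses what a plain UV map in Unity could reproduce.
import bpy

mat = bpy.data.materials.get("CaveMaterial")  # placeholder material name
if mat and mat.use_nodes:
    for node in mat.node_tree.nodes:
        if node.type == 'TEX_IMAGE' and node.projection == 'BOX':
            node.projection = 'FLAT'
```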

What Box projection is basically doing is using a 3-dimensional UVW coordinate instead of a UV coordinate, which is why you're still getting that stretching: there's no W coordinate in the UV map for it to reference (and there can't be).

Sphere

For some clarity, these are the results I'm getting, and that strange behavior on the corner is what I mean by it not being perfect, though I could get rid of it with a small adjustment to the node setup.

the reason you can't just bake it, is because while it does at some point need a 2d coordinate to sample the texture, blender is creating that 2d coordinate from the 3d coordinate per pixel, and there's no way to mimic that behavior without a custom shader in unity since we can only store per vertex data. (blender's basically just giving you triplanar mapping for free here, whereas there's no equivalent in unity)