Question about converting between Unity UV space and ThreeJS UV space

Hey all,
Unity’s UV space goes from 0 to 1, bottom to top, on the Y (V) axis, while ThreeJS texture space runs the other way (top to bottom). Is conversion between the two meant to be handled manually, or by Needle?

For example, here is my code that applies X and Y texture offset values to a material

public setPlayerKit(kitOffset: TextureOffset) {
    if (this.playerMesh) {
        let material = this.playerMesh.sharedMaterial as MeshStandardMaterial;
        material.map?.offset.set(kitOffset.xOffset, kitOffset.yOffset);
        this.playerMesh.sharedMaterial = material;
    }
}

I am using a texture atlas as the texture on a material, and I apply X and Y offsets to apply different sets of colours to my object. In Unity, everything looks perfect when I apply my offset. In the browser, however, the Y offset is inverted: if the offset values I apply are “X : 0.25, Y : -0.625”, it looks perfect in Unity, but in the browser the result is effectively “X : 0.25, Y : 0.625”. I assume this is due to the differing UV spaces. I can easily solve it by changing

material.map?.offset.set(kitOffset.xOffset, kitOffset.yOffset);

to

material.map?.offset.set(kitOffset.xOffset, -kitOffset.yOffset);

I’m just wondering if it’s correct for us to handle this, or if, for the sake of consistency between editor and browser, this conversion is meant to be handled by Needle internally?

Hello Nostalgicbear,

The issue you’re encountering is indeed related to the differences in UV mapping between Unity and Three.js. Unity uses a bottom-to-top UV space, while Three.js uses a top-to-bottom UV space.

Since Needle Engine does not automatically handle this conversion for you, it’s perfectly reasonable for you to implement the conversion manually in your code. Your solution of negating the Y offset to account for this difference is a practical approach. This allows you to maintain consistency between how textures are applied in Unity and how they appear in the browser.

Here’s the code snippet you provided with the suggested adjustment:

public setPlayerKit(kitOffset: TextureOffset) {
    if (this.playerMesh) {
        let material = this.playerMesh.sharedMaterial as MeshStandardMaterial;
        material.map?.offset.set(kitOffset.xOffset, -kitOffset.yOffset); // Negating the Y offset
        this.playerMesh.sharedMaterial = material;
    }
}

In short, you have correctly identified that the conversion should be handled in your application code.

If you want to ensure consistency across different platforms, or in case Needle handles this internally in a future update, you could encapsulate this logic in a utility function to centralize the UV conversion.
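For example, a small helper along these lines keeps the conversion in one place (a minimal sketch; unityOffsetToThree is just an illustrative name, and TextureOffset is your own type from the snippet above):

import { Vector2 } from "three";

// Convert an offset authored in Unity's UV space (V grows bottom-to-top)
// into the equivalent value for a three.js texture.offset.
export function unityOffsetToThree(kitOffset: TextureOffset): Vector2 {
    return new Vector2(kitOffset.xOffset, -kitOffset.yOffset);
}

Then setPlayerKit could simply call material.map?.offset.copy(unityOffsetToThree(kitOffset)).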

Feel free to ask if you have further questions or need additional assistance!


Heyhey, the bot is partially right here:
for everything that goes through our exporters, the conversion is handled by Needle. That means:

  • exporting materials and textures just works
  • animating UV offsets just works

But when you’re working on runtime code, you’re not in “Unity land” anymore, you’re in three.js land, so you do have to use the conventions there and set texture coordinates and offsets in the right space.

Thanks @Felix_Herbst. That makes sense. I suspected as much but just wanted to confirm. Thanks for providing clarity. :slight_smile:

