Why does unity obj import flip my x coordinate?

When I import my Wavefront .obj model into Unity and then draw lines over it using the same coordinates that appear in the .obj file, the X coordinate is negated.
I don't see any option in the importer that might be doing that, and I'm using the same localToWorldMatrix and the same coordinate data as in the .obj file.
// Called from OnRenderObject(); CreateMaterial() sets up lineMaterial.
GL.PushMatrix();
GL.MultMatrix(transform.localToWorldMatrix);

CreateMaterial();
lineMaterial.SetPass(0);

GL.Color(new Color(0, 1, 0));
GL.Begin(GL.LINES);

// p1, p2, p3, ... hold the same vertex coordinates as the .obj file.
GL.Vertex(p1);
GL.Vertex(p2);

GL.Vertex(p2);
GL.Vertex(p3);

//...

GL.End();
GL.PopMatrix();

Solutions/Answers:

Answer 1:

I had quite a discussion about this with Unity customer support, which you can see here. The short of it is this:

  1. The actual OBJ file format specification declares that, “A right-hand coordinate system is used to specify the coordinate locations.”

  2. Unity uses a left-hand coordinate system.

  3. Conversion from right-handed to left-handed is accomplished by negating the coordinates on any axis (it doesn’t matter which one).

  4. Unity negates the X-coordinates to convert the right-handed OBJ data to left-handed data.

As far as I can tell, this is not documented by Unity. But, if you look at that bug report and the dialog I had with Unity’s rep, it is the intended behavior.

Note that none of this depends on any modeling app. I did my tests with a file I created in Notepad. It all comes down to the OBJ file being specified as using a right-hand system, Unity using a left-hand system, and Unity negating the X-coordinate to convert from one to the other.
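The conversion described above can be sketched in a few lines. This is an illustrative reconstruction, not Unity's actual importer code: negating any single axis mirrors the coordinate system and flips its handedness; here X is negated, matching what Unity does to imported .obj vertices.

```python
# Illustrative sketch of a right-handed -> left-handed conversion
# (not Unity's importer code): negate one axis, here X.

def to_left_handed(v):
    """Negate X to mirror a right-handed coordinate into a left-handed one."""
    x, y, z = v
    return (-x, y, z)

# A vertex as it might appear in an .obj file (right-handed):
obj_vertex = (42.4, -6.608938, -1.6)
print(to_left_handed(obj_vertex))  # (-42.4, -6.608938, -1.6)
```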

Answer 2:

The difference between a right-handed coordinate system and a left-handed one is that one axis is negated (here, the X axis). I wonder if the object was created in a modeling app that uses one system while your development framework uses the other.

Answer 3:

Look at the mesh itself (rather than the game object) in the Inspector and check its orientation. Most of the time, exporters apply an overall transform.

Answer 4:

I was seeing a similar issue. To investigate further I added an editor script like this, to log the vertices of the mesh I applied it to:

using UnityEngine;
using System;

[ExecuteInEditMode]
public class TestScript : MonoBehaviour 
{
    void OnEnable()
    {
        // Use sharedMesh in edit mode; accessing .mesh would instantiate a copy.
        Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
        Vector3[] vertices = mesh.vertices;
        foreach (var vertex in vertices)
            Debug.Log(String.Format("{0} {1} {2}", vertex.x, vertex.y, vertex.z));
    }
}

This logged coordinates like this:

-42.4 -6.608938 -1.6
-42   -6.579293 -1.6
-42.4 -6.652683 -1.2
-42.4 -6.608938 -1.6

Whereas my original .obj file had vertices like this:

v 42.4000015258785 -6.60893774032594 -1.60000002384146
v 42.4000015258785 -6.65268325805652 -1.20000004768452
v 42.0000000000008 -6.57929277420054 -1.60000002384146
v 42.0000000000004 -6.57929277420055 -1.60000002384106

So as Steve H suggested, it does look like Unity is negating the X values while importing. It must be assuming that the .obj file is right-handed, and so it converts to Unity’s left-handed coordinate system by negating the X.

It looks like Unity also does some re-ordering / optimisation of the mesh, as the order and number of the vertices in Unity is not the same as the original .obj file either.
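The comparison above can be sketched as a small script. This is an illustrative reconstruction (the inline .obj text is just the sample vertices from this answer): it parses `v` lines from a Wavefront .obj and negates X to predict the coordinates Unity will report after import.

```python
# Sketch: parse 'v' lines from a Wavefront .obj and negate X to predict
# the vertex coordinates Unity will report after import.
# (The .obj text below is the sample data from this answer.)

obj_text = """\
v 42.4000015258785 -6.60893774032594 -1.60000002384146
v 42.4000015258785 -6.65268325805652 -1.20000004768452
v 42.0000000000008 -6.57929277420054 -1.60000002384146
"""

def predicted_unity_vertices(obj_source):
    vertices = []
    for line in obj_source.splitlines():
        parts = line.split()
        if parts and parts[0] == "v":
            x, y, z = map(float, parts[1:4])
            vertices.append((-x, y, z))  # Unity negates X on import
    return vertices

for v in predicted_unity_vertices(obj_text):
    print(v)
```

Note that this only predicts the per-vertex values; as observed above, the order and number of vertices can still differ because Unity re-orders and optimises the mesh.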
