Unity Texture2D raw data TextureFormat problem

I am trying to load raw image data into a Texture2D in Unity using LoadRawTextureData, but I can't get it to work correctly.
First, I receive an array of bytes from an unmanaged function that captures a window. I am pretty sure these bytes represent an image in BGRA32 format.
int dataSize = width * height * 4;
byte[] imgData = new byte[dataSize];
bool success = CaptureWindow(windowInfo, imgData, (uint)(dataSize));
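For reference, a declaration along the following lines is assumed for the unmanaged call above; the DLL name and the type of windowInfo are placeholders rather than anything stated in the question:

using System;
using System.Runtime.InteropServices;

public static class WindowCaptureInterop
{
    // Hypothetical P/Invoke declaration; the real DLL name and the
    // windowInfo parameter type depend on the capture library in use.
    [DllImport("WindowCapture.dll")]
    public static extern bool CaptureWindow(IntPtr windowInfo, byte[] buffer, uint bufferSize);
}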

Then I try to create a new Texture2D with this data:
Texture2D tex = new Texture2D(width, height, TextureFormat.BGRA32, false, true);
tex.LoadRawTextureData(imgData);
tex.Apply();

And finally I create a new plane and assign this texture to its material:
GameObject plane = GameObject.CreatePrimitive(PrimitiveType.Plane);
plane.GetComponent<Renderer>().material.mainTexture = tex;

This doesn't work: the color channels come out mixed up.

However, when I reverse the byte array before loading it into the texture as ARGB32, it does work:
Texture2D tex = new Texture2D(width, height, TextureFormat.ARGB32, false, true);
Array.Reverse(imgData);
tex.LoadRawTextureData(imgData);
tex.Apply();


The problem with this solution is that Array.Reverse is too slow.
Why doesn't loading it as BGRA32 work? Isn't BGRA32 just the reverse of ARGB32?
FYI: I am using Unity 5.1.2f1 on Windows and SystemInfo.SupportsTextureFormat returns true for both BGRA32 and ARGB32.

Solutions/Answers:

Answer 1:

I think the key point is that Array.Reverse reverses the whole array, whereas converting to ARGB only requires reversing the byte order within each pixel.

For example, take 3 BGRA32 pixels: (1,2,3,4), (5,6,7,8), (9,10,11,12). The parentheses do not exist in the actual array; they are only there to mark the pixel boundaries.

If you use Array.Reverse, you get (12,11,10,9), (8,7,6,5), (4,3,2,1): both the pixel order and the bytes inside each pixel are reversed.
What ARGB32 actually needs is (4,3,2,1), (8,7,6,5), (12,11,10,9): each pixel's bytes reversed, but the pixels kept in their original order.

That’s the difference; I hope this helps.
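If reversing the whole buffer with Array.Reverse is too slow, the per-pixel reversal described above can be done with a simple loop. This is only a sketch of that idea (not part of the original answer, and untested against the capture data in question):

static void ReverseChannelsPerPixel(byte[] data)
{
    // Swap the bytes inside each 4-byte pixel: (B,G,R,A) -> (A,R,G,B).
    // Unlike Array.Reverse, the pixel order itself is left untouched.
    for (int i = 0; i < data.Length; i += 4)
    {
        byte tmp = data[i];          // B <-> A
        data[i] = data[i + 3];
        data[i + 3] = tmp;

        tmp = data[i + 1];           // G <-> R
        data[i + 1] = data[i + 2];
        data[i + 2] = tmp;
    }
}

// Usage, following the snippets from the question:
// ReverseChannelsPerPixel(imgData);
// Texture2D tex = new Texture2D(width, height, TextureFormat.ARGB32, false, true);
// tex.LoadRawTextureData(imgData);
// tex.Apply();

Note that, because the pixel order is preserved here, the result may be oriented differently from the full Array.Reverse workaround, which also reverses the pixel order.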
