When the screen's framebuffer and the CPU have different endianness, colors are displayed swapped (e.g. BGRA becomes ARGB when a big-endian CPU writes to a little-endian framebuffer).
To fix this, Mesa needs to detect the framebuffer format and swap the color format when the framebuffer and CPU endianness don't match. In order to detect the framebuffer format, this change adds the SWRast Screen Info extension, which obtains the RGB offsets from X.
For now, only swapped BGRA/BGRX formats are supported.
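To illustrate the idea (this is a minimal sketch with hypothetical names, not Mesa's actual code): given the RGB channel shifts that X reports for the framebuffer, one can check whether they match the CPU-native BGRA layout or its byte-swapped counterpart, and byte-swap pixels when they don't match.

```c
#include <stdint.h>

/* Hypothetical classification of a 32-bit framebuffer layout from the
 * RGB shifts reported by X.  In a native little-endian BGRA pixel the
 * red channel sits at bit 16, green at 8, blue at 0; the byte-swapped
 * view (what a big-endian CPU sees) puts red at 8, green at 16,
 * blue at 24. */
typedef enum {
    FORMAT_BGRA8888,          /* shifts match the CPU-native layout   */
    FORMAT_BGRA8888_SWAPPED,  /* shifts match the byte-swapped layout */
    FORMAT_UNKNOWN
} pixel_format;

static pixel_format
classify_format(int red_shift, int green_shift, int blue_shift)
{
    if (red_shift == 16 && green_shift == 8 && blue_shift == 0)
        return FORMAT_BGRA8888;
    if (red_shift == 8 && green_shift == 16 && blue_shift == 24)
        return FORMAT_BGRA8888_SWAPPED;
    return FORMAT_UNKNOWN;
}

/* Swap the byte order of one 32-bit pixel (BGRA <-> ARGB). */
static uint32_t
swap_pixel(uint32_t px)
{
    return (px >> 24) | ((px >> 8) & 0x0000ff00u)
         | ((px << 8) & 0x00ff0000u) | (px << 24);
}
```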
This is a proposed patch to fix #4190. It was tested on a PowerPC64 machine with a built-in VGA adapter that has a little-endian ARGB 32 framebuffer, with both big-endian and little-endian FreeBSD installs. It was also tested on a big-endian FreeBSD virtual machine, using QEMU, that emulates a big-endian framebuffer. In all cases glxgears displayed correct colors.
The Mesa test suite was also run before and after the change, and no new failures were introduced.
I'm not very familiar with the Mesa code, so there are probably better ways to fix this. The DRI extension mechanism just seemed the cleanest way to make the screen info exposed by X available where it is needed, so that it can be used to select the correct color formats.
On the server side, when using Xserver, I implement the same SWRast Screen Info extension (to be published soon), in order to fix the RGB masks of the X visuals exported by GLX.
Regarding the RGB masks from the default visual: comparing them with the masks of Mesa's color formats seemed the best way to detect when the underlying framebuffer uses a different endianness. This assumes, however, that the X video driver adjusts the RGB masks properly, which happens in the scenarios tested above but may not always be the case. An alternative approach, on the client side, could be to use the display's dpy->byte_order field, which seems to correctly report the framebuffer endianness. On the Xserver side, however, the equivalent imageByteOrder field is always set to the endianness of the system itself, and thus can't be used.
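The mask comparison described above can be sketched roughly as follows (hypothetical names and structure, not the actual patch): if the visual's masks differ from a candidate format's masks but match them after byte-swapping, the framebuffer likely uses the opposite endianness and the swapped format should be chosen.

```c
#include <stdbool.h>
#include <stdint.h>

/* RGB channel masks as reported by an X visual or defined by a
 * candidate color format (hypothetical helper struct). */
struct rgb_masks {
    uint32_t red, green, blue;
};

static uint32_t
bswap32(uint32_t v)
{
    return (v >> 24) | ((v >> 8) & 0x0000ff00u)
         | ((v << 8) & 0x00ff0000u) | (v << 24);
}

static bool
masks_equal(struct rgb_masks a, struct rgb_masks b)
{
    return a.red == b.red && a.green == b.green && a.blue == b.blue;
}

/* Returns true when the visual's masks only match the candidate format
 * after byte-swapping, i.e. the swapped color format should be used. */
static bool
needs_swapped_format(struct rgb_masks visual, struct rgb_masks candidate)
{
    if (masks_equal(visual, candidate))
        return false;
    struct rgb_masks swapped = {
        bswap32(candidate.red),
        bswap32(candidate.green),
        bswap32(candidate.blue),
    };
    return masks_equal(visual, swapped);
}
```

For example, a big-endian view of a little-endian BGRA framebuffer would report red/green/blue masks of 0x0000ff00, 0x00ff0000 and 0xff000000, which match the usual 0x00ff0000/0x0000ff00/0x000000ff masks only after swapping.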