Client buffer scaling is broken with Pixman renderer
When a client surface's scale factor does not match the output's native scale factor, the Pixman renderer produces badly corrupted output (at least for alpha-blended views?). It looks like either view damage from moves is not processed correctly, or blending is simply broken.
The attached screenshot is the result of hacking libweston/compositor.c to always send wl_output.scale(1) to clients instead of the real scale factor, and starting the DRM backend with scale == 2 for the outputs. The same effect could be achieved by ignoring the scale event in clients/window.c.
I suppose this means there are no tests for the surface->scale != output->scale paths.