gl-renderer: add alpha cut-out support
Add support for an alpha cut-out wl_buffer type to the GL renderer.
This is intended for platforms with a video underlay rather than an overlay, a configuration often found in embedded systems. Displaying video content on such devices requires a transparent region of pixels in the framebuffer on the graphics plane in order to see the (portion of the) video plane underneath.
As far as I can tell, Weston's design allows for multiple hardware planes, but assumes that the graphics plane is the bottom-most one. There may be additional hardware planes on top of it, for example a cursor or video overlay, but none underneath. The Weston shell paints the entire background with an opaque colour, which on underlay-type devices completely covers the video plane.
The aim of this change is to make the video underlay usable in Weston.
A wl_buffer containing a decoded video frame is one way of doing video playback, albeit not the most efficient one: it doesn't use the available hardware video pipeline, and it is sometimes limited or impossible due to content protection.
Given the limited resources on embedded platforms, high-resolution content may exceed the capabilities of an embedded GPU.
In the case of protected content, the CPU/GPU may have very limited visibility and may only control the pipeline endpoints: the source stream to be decoded and the position/size of the video window on the hardware underlay, without any access to individual frames. For such content, alpha-blending of the video and graphics planes may only be possible in the scanout hardware.
The proposed solution
This solution uses a new type of wl_buffer that doesn't contain any data and represents a transparent rectangle in the graphics plane. A "video player" client creates such a wl_buffer to match the desired size of the video window and attaches it to its video (sub)surface. A commit results in the "video player" window being drawn in such a way that the background and any views underneath are overwritten with completely transparent pixels (0,0,0,0), creating a punch-through hole that makes the hardware video underlay partially visible.
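The punch-through can be modelled with a toy RGBA framebuffer: the shell background is painted opaque, then the cut-out rect is overwritten with (0,0,0,0), ignoring the previous contents, so the scanout hardware shows the underlay there. This is only an illustrative sketch; the buffer size and function names are made up for this example and are not Weston code.

```c
#include <stdint.h>
#include <string.h>

#define FB_W 8
#define FB_H 8

/* Tiny framebuffer, one byte per channel, RGBA order. */
static uint8_t fb[FB_H][FB_W][4];

/* Paint the whole framebuffer with an opaque colour, as the shell
 * background would be. */
static void
paint_background(uint8_t r, uint8_t g, uint8_t b)
{
	for (int y = 0; y < FB_H; y++)
		for (int x = 0; x < FB_W; x++) {
			fb[y][x][0] = r;
			fb[y][x][1] = g;
			fb[y][x][2] = b;
			fb[y][x][3] = 0xff;
		}
}

/* Punch the cut-out: overwrite the rect with fully transparent
 * pixels, discarding whatever was there (no blending involved). */
static void
punch_cut_out(int x0, int y0, int w, int h)
{
	for (int y = y0; y < y0 + h; y++)
		for (int x = x0; x < x0 + w; x++)
			memset(fb[y][x], 0, 4);
}
```

In the real renderer the equivalent operation is a draw over the cut-out region with blending disabled and an all-zero colour, rather than a CPU-side memset.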
Attaching a regular transparent wl_buffer doesn't solve the problem: it is alpha-blended using the SRC OVER Porter-Duff operator, so it doesn't make the already opaque pixels underneath any less opaque.
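The arithmetic makes this concrete. With premultiplied alpha, SRC OVER computes out = src + dst * (1 - src.alpha); for a fully transparent source (0,0,0,0) this reduces to out = dst, so an opaque destination pixel stays opaque. The cut-out instead writes (0,0,0,0) directly, ignoring the destination. The struct and function names below are invented for this sketch:

```c
/* A pixel in premultiplied-alpha form, channels in [0, 1]. */
struct rgba {
	float r, g, b, a;
};

/* Porter-Duff SRC OVER: out = src + dst * (1 - src.alpha).
 * With src = (0,0,0,0), out == dst: blending a transparent
 * buffer leaves the opaque background untouched. */
static struct rgba
src_over(struct rgba src, struct rgba dst)
{
	struct rgba out;

	out.r = src.r + dst.r * (1.0f - src.a);
	out.g = src.g + dst.g * (1.0f - src.a);
	out.b = src.b + dst.b * (1.0f - src.a);
	out.a = src.a + dst.a * (1.0f - src.a);

	return out;
}

/* The alpha cut-out: the destination is discarded outright, as if
 * blending were disabled (glDisable(GL_BLEND) in GL terms). */
static struct rgba
cut_out(struct rgba dst)
{
	struct rgba hole = { 0.0f, 0.0f, 0.0f, 0.0f };

	(void)dst; /* destination contents play no part in the result */
	return hole;
}
```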
Unlike a PC-style video overlay, the alpha cut-out for the underlay can be covered, in part or in full, by regular drawing operations for any views with a higher Z-order that are drawn over the cut-out. The final composition then looks identical to the decoded-video-frame method described above.
Embedded platforms may have different APIs for controlling the video pipeline, so platform-specific details are intentionally left out. The assumption is that a platform-specific surface extension will be defined, and the alpha cut-out will be used in combination with that video extension. The client app is expected to feed video data and control playback, while the compositor is responsible for framebuffer updates in sync with video position/size updates.