Regression in 16bpp between 1.18.3 and 1.18.4
Submitted by Alkis Georgopoulos
Assigned to Xorg Project Team
Version: 7.7 (2012.06)
Link to original bug (#100295)
Description
Created attachment 130337: Xorg.0.log from 1.18.4 on kvm/cirrus with corrupted image
16bpp was working fine with the modesetting driver in xserver-xorg-core=1.18.3, but it has various issues in 1.18.4, for example this corrupted double image: https://launchpadlibrarian.net/311563710/VirtualBox_Thin_20_03_2017_10_32_51.png
I've tested xinit -- -depth 16 on 1.18.4 with the following combinations, all of which were working fine in 1.18.3:
VirtualBox: VirtualBox Graphics Adapter [80ee:beef] | vboxvideo | modeset | vboxdrmfb ==> Corrupted small double image
kvm -vga virtio: Red Hat, Inc Device [1af4:1050] | virtio-pci | modeset | virtiodrmfb ==> Corrupted small double image
kvm -vga cirrus: Cirrus Logic GD 5446 [1013:00b8] | cirrus | modeset | cirrusdrmfb ==> Corrupted small single image. It gets fixed if I change the resolution with xrandr (commands below).
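To be explicit, the test is just starting a 16bpp session and then, on the cirrus guest, changing the mode from inside it with something like the following (the exact mode doesn't seem to matter, 800x600 is only an example):

  # 16bpp session with the modesetting driver
  xinit -- -depth 16

  # from a terminal inside that session: switching to any other mode
  # makes the corruption go away on kvm/cirrus
  xrandr -s 800x600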
I'm attaching the Xorg.0.log of that last kvm/cirrus test.
Btw, in all those cases, if I put "nomodeset" on the kernel command line I end up using the VESA driver, which doesn't have the issue.
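In case it helps with reproducing, I set nomodeset the usual way for a GRUB-based guest (this assumes a Debian/Ubuntu-style /etc/default/grub; on other setups just append it to the kernel line at the boot menu):

  # /etc/default/grub -- append nomodeset to the options already there, e.g.:
  GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"

  # then regenerate the config and reboot
  sudo update-grub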