[Gallium/Crocus] [Performance Regression] Setting a color depth of 16 causes some programs to use a color depth of 32, decreasing performance by around 37% compared to the i965 classic driver, which correctly sets a color depth of 16 for the same programs
System information

- OS: Ubuntu 22.10
- GPU: Intel HD 4600
- Kernel version: 5.18.7
- Mesa version: Mesa 22.3.0-devel (git-3ef88cd 2022-10-28, kinetic-oibaf-ppa)
- Xserver version: X.Org X Server 1.21.1.4
- Desktop manager: OpenBox
Describe the issue
Setting a color depth of 16 (`startx -- -depth 16`) causes some programs to use a color depth of 32, which decreases performance by around 37% with the crocus Gallium driver. The i965 classic driver correctly sets a color depth of 16 for the same programs.
I also tested compiling Mesa 21.3 with both the crocus Gallium driver and the i965 classic driver, and the problem is the same.
Programs like `glxgears` and Godot games are affected by this issue and can lose around 37% of their performance.
By contrast, setting a color depth of 24 (`startx -- -depth 24`) correctly sets a color depth of 24 for those same programs on both the crocus Gallium driver and the i965 classic driver.
I wanted to increase performance by setting a color depth of 16 instead of the default of 24, but doing so causes some programs to use a color depth of 32 on the crocus Gallium driver. I don't know whether this problem happens on other Gallium drivers.
Steps to Reproduce
1. Run `startx -- -depth 16`
2. Open a terminal and run `glxgears`
3. Open another terminal, run `xwininfo`, and click on the glxgears window
4. The depth shows `Depth: 32` when using the crocus Gallium driver (but `Depth: 16` when using the i965 classic driver)
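The depth check in steps 3–4 can be reduced to a one-liner; a minimal sketch (the `sed` expression just extracts the value of the `Depth:` line from `xwininfo` output, shown here on sample input so the extraction itself is visible):

```shell
# In a real session you would run:  xwininfo | sed -n 's/^ *Depth: *//p'
# and then click the glxgears window. Sample xwininfo output is piped
# here for illustration:
printf '  Width: 300\n  Depth: 32\n' | sed -n 's/^ *Depth: *//p'
# prints 32
```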
Screenshots/video files
These are screenshots of an edited Godot template game with VSync disabled (for testing performance) and FPS counter added. The FPS is on top left and the xwininfo
showing the Color Depth between using the "crocus" gallium driver and the "i965" classic driver while the Color Depth set to "16" (startx -- -depth 16
) is on the bottom right. These screenshots show 37% of performance being lost because of the Color Depth being set to "32" instead of "16" when using startx -- -depth 16
when using the "crocus" gallium driver.
Alternative solution found
In the file `src/gallium/frontends/dri/dri_screen.c`, changing

```c
if ((depth_bits[k] + stencil_bits[k] == 16) !=
    (red_bits + green_bits + blue_bits + alpha_bits == 16))
   continue;
```

to

```c
if ((depth_bits[k] + stencil_bits[k] == 32) !=
    (red_bits + green_bits + blue_bits + alpha_bits == 16))
   continue;
```

correctly sets a color depth of 16 for programs like `glxgears` and Godot games, instead of 32, on the crocus Gallium driver.
However, this change causes programs that force a color depth of 32, like the "Run" button in Unigine Heaven 4.0, to fail to start on the crocus Gallium driver because of Xorg errors. Clicking the "Run" button in Unigine Heaven 4.0 with the i965 classic driver successfully starts the benchmark with a color depth of 32, even though a color depth of 16 (`startx -- -depth 16`) is set.