Performance regression in radv affecting Guardians of the Galaxy
Description
There are two notable problems with the game. First, VRAM fills up during the benchmark, causing lower performance. Second, after the benchmark finishes and VRAM usage goes back down, performance remains poor, with the main menu dropping from over 100 FPS to under 20 FPS. Older RADV versions show lower VRAM usage and no such performance drop.
I tested the Epic Games Store version of the game.
Screenshots/video files
Steps to reproduce
The benchmark can be accessed from the settings menu. In affected RADV versions, VRAM fills up at the very start of the benchmark.
System information
Host: DeepBlue Kernel: 6.7.2-arch1-2 arch: x86_64 bits: 64 compiler: gcc
v: 13.2.1 Desktop: KDE Plasma v: 5.27.10 tk: Qt v: 5.15.12 wm: kwin_x11 dm:
1: LightDM note: stopped 2: SDDM Distro: Arch Linux
CPU:
Info: 6-core model: AMD Ryzen 5 5600G with Radeon Graphics bits: 64
type: MT MCP arch: Zen 3 rev: 0 cache: L1: 384 KiB L2: 3 MiB L3: 16 MiB
Speed (MHz): avg: 2948 high: 3865 min/max: 400/4464 cores: 1: 2994 2: 2993
3: 2992 4: 2991 5: 3865 6: 400 7: 2991 8: 3033 9: 2991 10: 3265 11: 3767
12: 3096 bogomips: 93456
Flags: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 sse4a ssse3 svm
Graphics:
Device-1: AMD Navi 23 [Radeon RX 6600/6600 XT/6600M] vendor: ASRock
driver: amdgpu v: kernel arch: RDNA-2 pcie: speed: 16 GT/s lanes: 16 ports:
active: HDMI-A-1 empty: DP-1,DP-2,DP-3 bus-ID: 03:00.0 chip-ID: 1002:73ff
Display: x11 server: X.Org v: 21.1.11 with: Xwayland v: 23.2.4
compositor: kwin_x11 driver: X: loaded: amdgpu
unloaded: fbdev,modesetting,vesa dri: radeonsi gpu: amdgpu display-ID: :0
screens: 1
Screen-1: 0 s-res: 1680x1050 s-dpi: 96
Monitor-1: HDMI-A-1 mapped: HDMI-A-0 model: LG (GoldStar) TV
res: 1680x1050 dpi: 90 diag: 558mm (22")
API: EGL v: 1.5 platforms: device: 0 drv: radeonsi device: 1 drv: swrast
surfaceless: drv: radeonsi x11: drv: radeonsi inactive: gbm,wayland
API: OpenGL v: 4.6 compat-v: 4.5 vendor: amd v: N/A glx-v: 1.4
direct-render: yes renderer: AMD Radeon RX 6600 (navi23 LLVM 16.0.6 DRM
3.56 6.7.2-arch1-2) device-ID: 1002:73ff
API: Vulkan v: 1.3.276 surfaces: xcb,xlib device: 0 type: discrete-gpu
driver: mesa radv device-ID: 1002:73ff
If applicable
- Xserver version: 1.21.1.11
- vkd3d-proton: v2.11-19-g0e681135.
- Wine/Proton version: 8
Regression
I have bisected, and the first commit with this problem is 374bd4e1
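For anyone wanting to reproduce the bisect, a rough sketch of the workflow follows. The good/bad refs, install prefix, and launch command are illustrative placeholders, not the exact ones used here:

```shell
git clone https://gitlab.freedesktop.org/mesa/mesa.git && cd mesa
# Known-good and known-bad refs below are placeholders.
git bisect start
git bisect bad main
git bisect good mesa-23.3.0

# At each bisect step: build only the AMD Vulkan driver and install
# it to a throwaway prefix so the system driver stays untouched.
meson setup build -Dvulkan-drivers=amd -Dprefix=$HOME/mesa-bisect
ninja -C build install

# Point the Vulkan loader at the freshly built RADV ICD for the test run,
# then launch the game (placeholder command) and check the behavior.
VK_DRIVER_FILES=$HOME/mesa-bisect/share/vulkan/icd.d/radeon_icd.x86_64.json <run game>

# Mark the result and repeat until git reports the first bad commit.
git bisect good   # or: git bisect bad
```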
Traces
Further information (optional)
This may or may not be related to the issue, but in affected versions the game detects 24 GB of available VRAM (incorrect) instead of 8 GB (correct).
On Windows, 24 GB is also reported when the VRAM gets full.
After further testing I found that AMDVLK also runs out of VRAM (though it performs better than RADV when this happens). I then tested other vkd3d-proton versions and found that the problem does not occur with 2.10, so it started somewhere between 2.10 and 2.11.
Edit: vkd3d-proton 2.11 enables DXR by default, and that change together with the commit above triggers the issue. If the user sets VKD3D_CONFIG=nodxr,
the issue won't appear.
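For reference, a minimal sketch of applying the workaround when launching the game; the executable name is a placeholder, and on Steam the equivalent launch option would be `VKD3D_CONFIG=nodxr %command%`:

```shell
# Disable DXR in vkd3d-proton; VKD3D_CONFIG takes a comma-separated
# list of options and is read by vkd3d-proton at startup.
export VKD3D_CONFIG=nodxr

# Launch the game through Wine/Proton as usual (placeholder command).
wine GotG.exe
```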
Added RMV traces from running the game with DXR enabled and with nodxr.