anv: large perf delta between Red Dead Redemption 2 on Linux vs Windows
System information
- Wine/Proton version: Proton Experimental
System:
Host: renatopereyra-ubuntu Kernel: 5.15.0-58-generic x86_64 bits: 64
compiler: gcc v: 11.3.0 Desktop: N/A wm: gnome-shell dm: GDM3
Distro: Ubuntu 22.04.1 LTS (Jammy Jellyfish)
CPU:
Info: quad core model: 11th Gen Intel Core i7-1165G7 bits: 64 type: MT MCP
arch: Tiger Lake rev: 1 cache: L1: 320 KiB L2: 5 MiB L3: 12 MiB
Speed (MHz): avg: 2636 high: 3812 min/max: 400/4700 cores: 1: 1746
2: 3812 3: 2833 4: 1716 5: 2504 6: 2988 7: 2484 8: 3011 bogomips: 44851
Flags: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx
Graphics:
Device-1: Intel TigerLake-LP GT2 [Iris Xe Graphics] vendor: Dell
driver: i915 v: kernel ports: active: eDP-1
empty: DP-1, DP-2, DP-3, HDMI-A-1, HDMI-A-2, HDMI-A-3
bus-ID: 0000:00:02.0 chip-ID: 8086:9a49
Device-2: Microdia Integrated_Webcam_HD type: USB driver: uvcvideo
bus-ID: 3-3:3 chip-ID: 0c45:6d13
Display: server: X.Org v: 1.22.1.1 compositor: gnome-shell driver:
gpu: i915 note: X driver n/a display-ID: :0 screens: 1
Screen-1: 0 s-res: 1920x1080 s-dpi: 96
Monitor-1: XWAYLAND0 mapped: eDP-1 model: Sharp res: 1920x1080 dpi: 168
diag: 337mm (13.3")
OpenGL: renderer: Mesa Intel Xe Graphics (TGL GT2)
v: 4.6 Mesa 22.3.4 - kisak-mesa PPA direct render: Yes
Describe the issue
When configured with the same settings, the in-game RDR2 benchmark performs vastly differently between Linux and Windows on the same dual-booting machine. The game is running with its native Vulkan backend, which should rule out any problems in Direct3D-to-Vulkan translation. The issue is consistently reproducible, and running the benchmark multiple times back-to-back does not change the overall results. Screenshots of settings and benchmark results: https://drive.google.com/drive/folders/1nW4gmk6ajyp7_SXLYji2621PG7KJL4e6?usp=sharing
Still reproduces as of d745e3b0.
Note that the settings auto-selected by the game's quality slider vary and appear to depend on the amount of VRAM reported by the driver. Because the two OSes report different amounts of available VRAM, each setting had to be toggled manually on Linux to actually match the Windows configuration; this applies to the locked settings as well. I couldn't get "Geometry Level of Detail" on Linux to match the Windows value (~50%), so I opted to make it slightly lower (~40%).
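For anyone trying to reproduce the settings mismatch: the VRAM figure the game bases its auto-selection on should correspond to the memory heaps the Vulkan driver reports. A quick way to compare what each OS exposes (assuming the `vulkaninfo` tool from the vulkan-tools package is installed; on Windows, `vulkaninfo.exe` from the Vulkan SDK prints the same section) is:

```shell
# Print the device memory heaps the Vulkan driver reports.
# The heap sizes here are what applications see via
# vkGetPhysicalDeviceMemoryProperties, so comparing this output
# between the Linux and Windows boots shows the VRAM discrepancy.
vulkaninfo 2>/dev/null | grep -A 3 'memoryHeaps'
```

Running this under both OSes and diffing the heap sizes would confirm whether the differing auto-selected settings come from the reported memory and not from something else in the game's detection logic.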