crocus/Sandy Bridge: The game "A Hat in Time" does not start up with Gallium Nine
Summary
This bug report is an "offshoot" of bug #7247 (closed), which is r600-related. Although the observed symptoms differ, the root cause is most likely the same.
When I start the game A Hat in Time on my Sandy Bridge-based Intel HD 2000 iGPU, the program aborts on its own after some time. Only the game's start-up splash screen is visible for a while.
The underlying problem was already bisected by Axel Davy when he fixed this for radeonsi; see MR !9578 (merged).
More information can be found in bug #7247 (closed); quoting Axel Davy:
I remember this game has issues running on radeonsi because the nir generated and kept in memory was too big. The patches I introduced enabled to reduce this issue and make the game run on radeonsi. I guess r600 must be keeping too many version of the nir, just like radeonsi used to. The game compiles a massive amount of shaders. You must be mistaken about it being 64 bit else it wouldn't have the issue. One workaround would be to not compile the shaders before their first use, basically introducing stuttering when the game uses the shader for the first time, but saving the memory taken by nir.
As far as I understand the topic, all NIR drivers are most likely affected. As mentioned, the problem was fixed about a year ago by Axel Davy for the radeonsi driver, and it might be fixed in the near future for r600 and eventually for r300 as well.
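The workaround Axel Davy describes, compiling a shader only on its first use instead of precompiling everything up front, can be sketched as a lazy cache. This is purely illustrative Python, not Mesa code; the class and function names are invented:

```python
# Illustrative sketch (invented names, not Mesa internals): compile a
# shader only on its first use, so variants that are never used never
# hold their IR in memory. The cost is a stutter on first use.

class LazyShaderCache:
    def __init__(self, compile_fn):
        self._compile = compile_fn   # expensive: source -> compiled blob
        self._compiled = {}          # key -> compiled shader

    def get(self, key, source):
        # First use pays the compile cost; later uses hit the cache.
        if key not in self._compiled:
            self._compiled[key] = self._compile(source)
        return self._compiled[key]

compile_calls = []

def fake_compile(src):
    compile_calls.append(src)        # record the expensive step
    return f"binary({src})"

cache = LazyShaderCache(fake_compile)
cache.get("variant_a", "vs_main")
cache.get("variant_a", "vs_main")    # cached: no second compile
assert compile_calls == ["vs_main"]
```

The trade-off is exactly the one named in the quote: memory saved on never-used shaders in exchange for first-use stutter.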
Addition (16.12.2022): The corresponding problem was fixed in the r600 driver with MR !20061 (merged), "Store nir shaders serialized to save memory". This approach appears to be somewhat faster than the variant applied to radeonsi, so there is no longer any relevant "waiting time" when A Hat in Time is launched. Launch times are comparable to WineD3D, which also means that my old iMac 12,2 outperforms a much newer Ryzen 7 5700U based Asus AiO system.
An apitrace is also available, see below.
So in the end it looks like all NIR drivers need either a "lazy shader compilation" or a "store NIR shaders serialized" tweak. This would also include the iris and the nouveau (nv30/nv50/nvc0) drivers.
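The alternative tweak from MR !20061, keeping shader IR as a serialized (and here additionally compressed) blob and deserializing it only on demand, can be illustrated as follows. This is a hedged sketch using Python's standard library, not the actual C implementation in the MR:

```python
# Illustrative sketch (not the actual MR !20061 code): hold shader IR as
# a compressed serialized blob instead of a live in-memory structure,
# trading a little CPU on access for a large cut in resident memory.
import pickle
import zlib

def store_ir(ir_object):
    # Serialize and compress the IR for cheap long-term storage.
    return zlib.compress(pickle.dumps(ir_object))

def load_ir(blob):
    # Reconstruct the IR only when it is actually needed.
    return pickle.loads(zlib.decompress(blob))

# Repetitive IR (like real shader instruction lists) compresses well.
ir = {"instructions": ["fmul r0, r1, r2"] * 1000}
blob = store_ir(ir)
assert load_ir(blob) == ir
assert len(blob) < len(pickle.dumps(ir))
```

The design point is the same one the MR title states: the serialized form costs deserialization time on use but does not keep thousands of full NIR structures resident at once.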
System information
inxi -GSC -xx
System:
Host: iMac-Urs Kernel: 5.15.0-48-generic x86_64 bits: 64 compiler: gcc
v: 11.2.0 Desktop: KDE Plasma 5.24.6 tk: Qt 5.15.3 wm: kwin_x11 dm: SDDM
Distro: Ubuntu 22.04.1 LTS (Jammy Jellyfish)
CPU:
Info: quad core model: Intel Core i5-2400 bits: 64 type: MCP
arch: Sandy Bridge rev: 7 cache: L1: 256 KiB L2: 1024 KiB L3: 6 MiB
Speed (MHz): avg: 1600 min/max: 1600/3400 cores: 1: 1600 2: 1600 3: 1600
4: 1600 bogomips: 24799
Flags: avx ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx
Graphics:
Device-1: Intel 2nd Generation Core Processor Family Integrated Graphics
vendor: Apple driver: i915 v: kernel ports: active: none empty: VGA-1
bus-ID: 00:02.0 chip-ID: 8086:0102
Device-2: AMD Whistler [Radeon HD 6730M/6770M/7690M XT] vendor: Apple
driver: radeon v: kernel pcie: speed: 2.5 GT/s lanes: 16 ports:
active: eDP-1 empty: DP-1, DP-2, DP-3, DP-4, VGA-2 bus-ID: 01:00.0
chip-ID: 1002:6740
Device-3: Apple FaceTime HD Camera (Built-in) type: USB driver: uvcvideo
bus-ID: 1-2:3 chip-ID: 05ac:850b
Display: x11 server: X.Org v: 1.21.1.3 compositor: kwin_x11 driver: X:
loaded: ati,modesetting,radeon unloaded: fbdev,vesa gpu: radeon
display-ID: :0 screens: 1
Screen-1: 0 s-res: 2560x1440 s-dpi: 96
Monitor-1: eDP res: 2560x1440 dpi: 109 diag: 685mm (27")
OpenGL: renderer: AMD TURKS (DRM 2.50.0 / 5.15.0-48-generic LLVM 14.0.6)
v: 4.5 Mesa 22.3.0-devel (git-27aa172 2022-10-09 jammy-oibaf-ppa)
direct render: Yes
If applicable
- Wine version: 7.17
- Kubuntu 22.04 LTS
Log files as attachment
A Hat in Time r600 Apitrace (15.12.2022)
https://drive.google.com/file/d/1AY8mQbs17eUrSC5SV3jMxPhZjxH5XZbx/view?usp=sharing
=> This apitrace was made via TGSI after !20061 (merged) landed, with the "Precache Shaders" option disabled.
A Hat in Time r600 Apitrace (12.09.2022)
https://drive.google.com/file/d/1gMQR_-6fXtb606_nclH44Mt_tQ1i7D5K/view?usp=sharing
=> Original pre-!20061 (merged) apitrace. On r600 with NIR it consumes over 32 GB of main memory (instead of 4 GB with TGSI), mostly because the "Precache Shaders" option is enabled.
Any extra information would be greatly appreciated
The game works with WineD3D, but the performance, especially on older systems, is not great.
Addition (16.12.2022): The long level-loading times (caused by pre-compiling a massive amount of shaders) can also be improved by disabling the Precache Shaders feature in the game. This feature seems to be enabled in GOG game build 59270, although the menu notes that the default is "unchecked", i.e. disabled. In any case, I really should have figured that out sooner: