radeonsi: `AMD_DEBUG=useaco` breaks shader disk cache
System information
- OS: Arch Linux
- GPU: AMD Radeon RX 6700 XT
- Kernel version: Linux 6.5.6-lqx1-1-lqx
- Mesa version: Mesa 23.3.0-devel (git-1e820ac1)
Describe the issue
If the first OpenGL application is launched with `AMD_DEBUG=useaco` when the shader disk cache is created, all applications that use the LLVM backend break. The same thing happens in reverse: if the first application is launched with the LLVM backend, it breaks ACO.
For example, populate a fresh cache with the default LLVM backend, then launch with ACO; the second command produces corrupted output:

```
MESA_SHADER_CACHE_DIR=test_cache glxgears
MESA_SHADER_CACHE_DIR=test_cache AMD_DEBUG=useaco glxgears
```
Now remove the cache with `rm -R test_cache` and run the commands in the opposite order; this time the LLVM run breaks:
```
MESA_SHADER_CACHE_DIR=test_cache AMD_DEBUG=useaco glxgears
MESA_SHADER_CACHE_DIR=test_cache glxgears
```
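The symptom is consistent with both backends sharing the same disk-cache key for a given shader, so each backend ends up loading binaries produced by the other. A minimal sketch of that idea (a hypothetical key scheme for illustration, not Mesa's actual cache code):

```python
import hashlib

def cache_key(shader_source: str, backend: str = "") -> str:
    """Derive a disk-cache key for a shader (hypothetical scheme).

    If the backend name is not mixed into the key, ACO and LLVM
    compute identical keys for the same source and will read each
    other's incompatible binaries back from the cache.
    """
    h = hashlib.sha1(shader_source.encode())
    if backend:
        h.update(backend.encode())  # disambiguates ACO vs. LLVM entries
    return h.hexdigest()

src = "void main() { gl_FragColor = vec4(1.0); }"

# Without the backend in the key, both compilers collide on one entry:
assert cache_key(src) == cache_key(src)

# Mixing the backend name in yields distinct keys, so each backend
# only ever loads binaries it produced itself:
assert cache_key(src, "aco") != cache_key(src, "llvm")
```

Any fix along these lines would make the two backends' cache entries mutually invisible instead of mutually corrupting.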
Regression
Yes. I haven't bisected, but this worked a few months ago.
Logs
This is printed when the output is corrupted:
```
ac_rtld error: !ehdr
ELF error: invalid `Elf' handle
LLVM failed to upload shader
EE ../../../../../mesa/src/gallium/drivers/radeonsi/si_state_shaders.cpp:2491 si_build_shader_variant - Failed to build shader variant (type=0)
ac_rtld error: !ehdr
ELF error: invalid `Elf' handle
LLVM failed to upload shader
EE ../../../../../mesa/src/gallium/drivers/radeonsi/si_state_shaders.cpp:2491 si_build_shader_variant - Failed to build shader variant (type=0)
```
Edited by Hannes Mann