nir/opt_16b_tex_image: Sign extension should matter for texel buffer txf
What does this MR do and why?
A texel buffer can be arbitrarily large, so the assumption made in the following comment is wrong:
"Zero-extension (u16) and sign-extension (i16) have
the same behavior here - txf returns 0 if bit 15 is set
because it's out of bounds and the higher bits don't matter."
Sign extension does matter for GLSL_SAMPLER_DIM_BUF.
This fixes the case of doing a texelFetch with a u16 offset:
uniform itextureBuffer s1;
uint16_t offset = some_ssbo.offset;
value = texelFetch(s1, offset).x;
If the offset is larger than the maximum s16 value, the optimization incorrectly left it at 16 bits.
In SPIR-V the above GLSL is translated into:
%22 = OpLoad %ushort %21
%23 = OpUConvert %uint %22
%24 = OpBitcast %int %23
%26 = OpImageFetch %v4int %16 %24
Cc: mesa-stable
Example reproducer for Turnip:
#!amber
SHADER compute compute_shader GLSL
#version 450
#extension GL_EXT_shader_explicit_arithmetic_types_int16 : require
#extension GL_EXT_shader_16bit_storage : require
layout(set = 0, binding = 0) buffer block {
  int value;
  u16vec2 offset;
};
layout(set = 0, binding = 1) uniform itextureBuffer s1;
void
main()
{
  value = texelFetch(s1, offset.x).x;
}
END
BUFFER buf_in DATA_TYPE R16G16_SINT SIZE 65536 FILL 7
BUFFER buf_out DATA_TYPE uint16 DATA 0 0 32768 32769 END
PIPELINE compute pipeline
  ATTACH compute_shader
  BIND BUFFER buf_out AS storage DESCRIPTOR_SET 0 BINDING 0
  BIND BUFFER buf_in AS uniform_texel_buffer DESCRIPTOR_SET 0 BINDING 1
END
RUN pipeline 1 1 1
EXPECT buf_out IDX 0 EQ 7