v3d: set 16bit uint return sizes to 32 bits
TMU output can be either 32 bits or 16 bits. For 16-bit textures, we configure it to use 16 bits.
But this can fail if the texture is a 16-bit uint type and we read it with an isampler2D(): since the TMU output is then 16-bit signed, the result can overflow if the read value is too high (> 32767).
In this case, it's better to configure the TMU output as 32-bit.
This fixes spec@!opengl 3.0@gl-3.0-texture-integer.