etnaviv: Hook up fused ReLU activation
ReLU is a no-op if the output tensor's quantization already clamps the minimum representable value to 0, but the TFLite format does not guarantee that.
Enable configuring a fused ReLU activation in convolution operations so clamping happens in either case.
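A minimal sketch of the underlying quantization argument (illustrative only, not driver code): with an asymmetric uint8 quantization, the representable real range is `scale * (q - zero_point)` for q in [0, 255], so the minimum is 0 only when the zero point is 0; a nonzero zero point makes negative values representable and the fused ReLU actually necessary.

```python
def dequant_range(scale, zero_point, qmin=0, qmax=255):
    # Real-value range representable by an asymmetric uint8 quantization:
    # real = scale * (q - zero_point), q in [qmin, qmax].
    return scale * (qmin - zero_point), scale * (qmax - zero_point)

# zero_point == 0: the minimum representable value is 0,
# so the quantization itself already clamps negatives and ReLU is a no-op.
lo, hi = dequant_range(0.05, 0)
assert lo == 0.0

# zero_point > 0: negative real values are representable,
# so a fused ReLU clamp is required for correct results.
lo, hi = dequant_range(0.05, 10)
assert lo < 0.0
```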
Based on !31842 (merged).