
etnaviv: Hook up fused ReLU activation

Philipp Zabel requested to merge pH5/mesa:imx8mp-fused-relu into main

ReLU is a no-op if the output tensor quantization already clamps minimum values to 0, but that is not guaranteed by the TFLite format.

Enable configuring fused ReLU activation in convolution operations to be sure.
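A minimal sketch of the reasoning, not the actual etnaviv code: for an affine-quantized tensor, `real = scale * (q - zero_point)`, so the quantization range only guarantees clamping at 0 when the zero point equals the lowest quantized value. The struct and helper names below are hypothetical.

```c
/* Hypothetical sketch: decide whether a fused ReLU must be programmed
 * for a quantized convolution output. If the lowest representable real
 * value is already 0, ReLU is a no-op; otherwise the hardware must
 * clamp explicitly via the fused activation.
 */

#include <stdbool.h>
#include <stdint.h>

struct quant_params {
   float scale;
   int32_t zero_point;
   int32_t q_min;   /* e.g. 0 for uint8, -128 for int8 */
};

static bool
relu_needed(const struct quant_params *out)
{
   /* Lowest representable real value is scale * (q_min - zero_point);
    * it is 0 only when zero_point == q_min. */
   return out->zero_point != out->q_min;
}
```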

Based on !31842 (merged).

@tomeu
