gst-inference: upstream

Summary

gst-inference is a GStreamer deep learning inference framework. Documentation can be found here.

It currently supports TensorFlow, NCSDK, TensorFlow Lite, EdgeTPU, and ONNX neural networks.

gst-inference depends on the r2inference library, which provides backends for various neural network toolkits.

Build instructions for r2inference can be found here.

Testing

Test examples are provided in the tests/examples directory.

Prerequisites

  1. A neural network framework backend (currently TensorFlow, TensorFlow Lite, or NCSDK).
  2. A model from the model zoo.

Here is an example that detects objects with the TinyYOLOv2 model:

# Inference bin parameters
ARCH='tinyyolov2'
BACKEND='tensorflow'
MODEL_LOCATION='graph_tinyyolov2_tensorflow.pb'
INPUT='input/Placeholder'
OUTPUT='add_8'
LABELS='labels.txt'
CROP=false
OVERLAY=true
FILTER=-1

gst-launch-1.0 v4l2src ! inferencebin arch=$ARCH model-location=$MODEL_LOCATION backend=$BACKEND input-layer=$INPUT output-layer=$OUTPUT \
labels=$LABELS crop=$CROP overlay=$OVERLAY filter=$FILTER ! videoconvert ! ximagesink sync=false
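The same inferencebin parameters can be reused for a file-based source instead of a live camera. Below is a sketch of such a variant; the InceptionV4 architecture name, layer names, and file names are assumptions for illustration and must be checked against the actual graph from the model zoo:

```shell
#!/bin/sh
# Hypothetical classification variant of the pipeline above.
# All model parameters here are assumed, not taken from the merge request;
# verify layer names against your own .pb graph before running.
ARCH='inceptionv4'
BACKEND='tensorflow'
MODEL_LOCATION='graph_inceptionv4_tensorflow.pb'
INPUT='input'        # assumed input layer name
OUTPUT='Softmax'     # assumed output layer name
LABELS='imagenet_labels.txt'

# Build the pipeline string first so it can be inspected before launching.
PIPELINE="filesrc location=video.mp4 ! decodebin ! videoconvert ! \
inferencebin arch=$ARCH model-location=$MODEL_LOCATION backend=$BACKEND \
input-layer=$INPUT output-layer=$OUTPUT labels=$LABELS overlay=true ! \
videoconvert ! ximagesink sync=false"

echo "$PIPELINE"
# gst-launch-1.0 $PIPELINE   # uncomment on a machine with gst-inference installed
```

Building the pipeline description in a variable makes it easy to print and review the fully expanded command before handing it to gst-launch-1.0.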

This merge request is a continuation of a previous one.
