To get started with Inference DS, follow the installation instructions first and make sure you have an Inference DS Core running on your local machine or as a remote instance. Then download the Inference DS User Interface.

Remote Connection
If you want to connect to a remote instance of the Inference DS Core, make sure the remote machine accepts connections on port 6716 and configure the Core to accept remote connections. To do so, open ~/.inds/settings.yml and add this line to the server configuration:
http_server: { host: }
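If the file already contains other settings, the server entry sits at the top level. A minimal settings.yml illustrating the context (the surrounding comments are explanatory, not part of the required syntax):

```yaml
# ~/.inds/settings.yml
# The empty host value makes the Core accept remote connections
# on port 6716 instead of listening only locally.
http_server: { host: }
```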

When you first start the Inference DS User Interface, the view tab shows no data sources, as nothing has been configured yet:

Head over to the configuration tab, where we will set up our first processing pipeline. The pipeline will consist of two nodes: one reads images from a directory, and the other performs object detection on those images. You can download example images and an example object detection model here:



Let’s start by creating the file input node: select “add input” and specify file_input as the input type. Add a name, specify the directory containing the images, and select the file type “image”. In addition, enable the “repeat” flag so that the node reads the images again once it reaches the end of the directory. Finally, set a small delay between images to make viewing the detection results more pleasant.
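Purely as an illustration, the file input settings described above might look like this in an exported configuration. The actual export schema of Inference DS is not shown in this guide, so all field names and values here are assumptions:

```yaml
# Hypothetical sketch of the file input node; the real export
# format may use different field names.
inputs:
  - type: file_input
    name: example_images          # any descriptive name
    directory: /path/to/images    # directory with the downloaded example images
    file_type: image
    repeat: true                  # re-read the images once the end is reached
    delay: 0.5                    # small delay (seconds) for easier viewing
```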

Once the input is created, go to “models” and add a model. Here, you need to specify the path to the network model file model.onnx as well as to model.parameters. As soon as the model is created, connect the file input node to the model via the connections menu of the model node.
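Continuing the illustrative sketch above, the model node and its connection to the file input might appear like this in an exported configuration. Again, the field names are assumptions, not the actual Inference DS schema; only the file names model.onnx and model.parameters come from this guide:

```yaml
# Hypothetical sketch of the model node and its connection;
# the real export format may differ.
models:
  - name: object_detector
    model_file: model.onnx            # network model file
    parameter_file: model.parameters  # accompanying parameter file
connections:
  - from: example_images    # the file input node created above
    to: object_detector
```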

Now, open the control tab and start the nodes. The file input reads images from the specified directory and sends them to the model, which performs the object detection. In the view tab, you can now select the outputs of both the file input node and the model node. You can view the detections and adjust the selectivity. This value acts as a threshold on the object confidence: objects below it are discarded.
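The selectivity value behaves like any confidence cutoff on detection results. A minimal Python sketch of the idea, where the detection format and field names are illustrative and not the actual Inference DS output:

```python
# Illustrative confidence filtering, mirroring what the selectivity
# setting does in the view tab; the detection format is made up here.
def filter_detections(detections, selectivity):
    """Keep only detections whose confidence reaches the threshold."""
    return [d for d in detections if d["confidence"] >= selectivity]

detections = [
    {"label": "car", "confidence": 0.91},
    {"label": "person", "confidence": 0.42},
    {"label": "dog", "confidence": 0.77},
]

# With a selectivity of 0.5, the low-confidence detection is discarded.
kept = filter_detections(detections, 0.5)
print([d["label"] for d in kept])  # ['car', 'dog']
```

Raising the selectivity hides uncertain detections; lowering it shows more candidates at the cost of false positives.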

Congratulations! You have created your first pipeline and are now ready to add plugins or output nodes to it. The current configuration can be exported and imported on other Cores, which makes it easy to set up larger numbers of systems. Learn more about configurations in the section Configuration, and about writing custom Python plugins in the guide on Plugin Development.