Inference DS (INDS) is the central execution environment for deep learning models trained with Deep Learning DS. Inference DS comprises interfaces to many 2D and 3D cameras, ships plugins for pre- and post-processing, and allows you to write custom plugins in Python. Inference DS is split into two parts: the Inference DS Core, which executes the models, and the Inference DS user interface, which lets you configure your processing pipeline and view the results. The Core can run completely headless on servers and even on embedded devices such as the NVIDIA Jetson platform.
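
To illustrate what a custom post-processing plugin could look like, the following minimal Python sketch filters model detections by a confidence threshold. The class name, the `process` method, and the detection format are assumptions chosen for illustration only and do not reflect the actual Inference DS plugin interface; refer to the plugin documentation for the real API.

```python
class ThresholdPostProcessor:
    """Hypothetical post-processing plugin that drops low-confidence detections.

    Note: class and method names are illustrative assumptions, not the
    official Inference DS plugin interface.
    """

    def __init__(self, min_confidence: float = 0.5):
        self.min_confidence = min_confidence

    def process(self, detections: list[dict]) -> list[dict]:
        # Keep only detections whose confidence meets the threshold.
        return [
            d for d in detections
            if d.get("confidence", 0.0) >= self.min_confidence
        ]


if __name__ == "__main__":
    # Example model output as a list of detection dictionaries (assumed format).
    detections = [
        {"label": "scratch", "confidence": 0.91},
        {"label": "dent", "confidence": 0.32},
    ]
    plugin = ThresholdPostProcessor(min_confidence=0.5)
    print(plugin.process(detections))  # only the "scratch" detection remains
```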

Set up your system by following the installation guide, then get started using Inference DS to execute your deep neural networks.