# Voyager SDK repository

v1.5: Release notes

The Voyager SDK makes it easy to build high-performance inferencing applications with Axelera AI Metis devices. The sections below provide links to code examples, tutorials and reference documentation.

> **Important**
>
> This is a production-ready release of the Voyager SDK. Software components and features that are still in development are marked "[Beta]", indicating tested functionality that will continue to grow in future releases, or "[Experimental]", indicating an early-stage feature with limited testing.

## Install SDK and get started

| Document | Description |
| --- | --- |
| Installation guide | Explains how to set up the Voyager SDK repository and toolchain on your development system |
| Quick start guide | Explains how to deploy and run your first model |
| Windows getting started guide | Explains how to install the Voyager SDK and run a model on Windows 11 |
| AxDevice manual | AxDevice is a tool that lists all Metis boards connected to your system and can configure their settings |
| Board firmware update guide | Explains how to update your board firmware (for customers with older boards who have received instructions) |
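
The commands below sketch a typical first setup. The installer script name and its behavior are assumptions for illustration only; the Installation guide is the authoritative reference.

```bash
# Clone the Voyager SDK repository.
git clone https://github.com/axelera-ai-hub/voyager-sdk.git
cd voyager-sdk

# Run the SDK installer (script name assumed; see the Installation guide).
./install.sh
```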

## Deploy models on Metis devices

| Document | Description |
| --- | --- |
| Model zoo | Lists all models supported by this release of the Voyager SDK |
| Deployment manual (deploy.py) | Explains all options provided by the command-line deployment tool |
| Custom weights tutorial | Explains how to deploy a model using your own weights |
| Custom model tutorial | Explains how to deploy a custom model |
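
As a quick illustration, the command below deploys a model-zoo network with the command-line deployment tool. The model name is an assumption; pick any network listed in the Model zoo.

```bash
# Compile and deploy a model-zoo network for Metis
# ("yolov5s-v7-coco" is an assumed example name from the model zoo).
./deploy.py yolov5s-v7-coco
```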

## Run models on Metis devices

| Document | Description |
| --- | --- |
| Benchmarking guide | Explains how to measure end-to-end performance and accuracy |
| Inferencing manual (inference.py) | Explains all options provided by the command-line inferencing tool |
| Application integration tutorial (high level) | Explains how to integrate a YAML pipeline within your application |
| Application integration tutorial (low level) | Explains how to integrate an AxInferenceNet model within your application |
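
For example, a deployed model can be run on a local video source with the command-line inferencing tool. The model name and media path below are illustrative assumptions; the Inferencing manual documents the full set of options.

```bash
# Run inference on a video file with a previously deployed network
# (network name and media path are placeholders).
./inference.py yolov5s-v7-coco media/traffic1_1080p.mp4
```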

## Application integration APIs

The Voyager SDK allows you to develop inferencing pipelines and end-user applications at different levels of abstraction.

| API | Description |
| --- | --- |
| InferenceStream (high level) | Library for reading pipeline images and inference metadata directly from within your application |
| AxInferenceNet (middle level) | C/C++ API reference for integrating model inferencing and pipeline construction directly within an application |
| AxRuntime (low level) | Python and C/C++ APIs for manually constructing, configuring and executing pipelines |
| GStreamer | Plugins for integrating Metis inferencing within a GStreamer pipeline |

The InferenceStream library is the easiest to use and enables most users to achieve the highest performance. The lower-level APIs enable expert users to integrate Metis within existing video streaming frameworks.
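
The sketch below shows the general shape of an InferenceStream-based application: open a stream on a network and a source, then iterate over frames and their inference metadata. The module path, function name, parameters and accessors used here are assumptions for illustration; the InferenceStream reference and the high-level application integration tutorial document the real API.

```python
# Hypothetical sketch of the high-level InferenceStream flow.
# All names below are assumed for illustration; consult the
# InferenceStream API reference for the actual interface.
from axelera.app.stream import create_inference_stream  # assumed import path

# Open a stream that runs a model-zoo network on a video source.
stream = create_inference_stream(
    network="yolov5s-v7-coco",             # assumed model-zoo name
    sources=["media/traffic1_1080p.mp4"],  # assumed sample video
)

# Each iteration yields the decoded frame plus its inference metadata.
for frame_result in stream:
    image = frame_result.image   # assumed accessor for the frame image
    meta = frame_result.meta     # assumed accessor for detections, etc.
    print(meta)
```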

## Reference pipelines

The Voyager SDK makes it easy to construct pipelines that combine multiple models in different ways. A number of end-to-end reference pipelines are provided, which you can use as templates for your own projects.

| Directory | Description |
| --- | --- |
| /ax_models/reference/parallel | Multiple pipelines running in parallel |
| /ax_models/reference/cascade | Cascaded pipelines in which the output of one model is the input to a secondary model |
| /ax_models/reference/cascade/with_tracker | Cascaded pipelines in which the output of the first model is tracked before being input to a secondary model |
| /ax_models/reference/image_preprocess | Pipelines in which the camera input is preprocessed before being used for inferencing |
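
A reference pipeline can typically be launched with the command-line inferencing tool by pointing it at the pipeline description. The filename below is a placeholder, and passing a YAML path to inference.py is an assumption; see the reference pipeline directories and the Inferencing manual for the exact invocation.

```bash
# Run a cascaded reference pipeline on a video source
# (pipeline filename and media path are placeholders).
./inference.py ax_models/reference/cascade/<pipeline>.yaml media/traffic1_1080p.mp4
```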

## Additional documentation

This section provides links to additional documentation available in the Voyager SDK repository.

| Document | Description |
| --- | --- |
| Advanced deployment tutorials | Advanced deployment options [Experimental] |
| AxRunmodel manual | AxRunModel is a tool that runs deployed models on Metis hardware using different features of the AxRuntime API (such as DMA buffers, double buffering and multiple cores) |
| Compiler CLI | Compiler command-line interface [Beta] |
| Compiler API | Python compiler API [Experimental] |
| ONNX operator support | List of ONNX operators supported by the Axelera AI compiler |
| Thermal guide | Details the thermal behavior of Metis devices and provides instructions for making changes |
| SLM/LLM inference tutorial | Explains how to run language models on Metis devices [Experimental] |

## Further support
