# inference

Here are 27 public repositories matching this topic...

EdgeInfer enables efficient edge intelligence by running small AI models, including embedding and ONNX models, on resource-constrained devices such as Android, iOS, or MCUs for real-time decision-making. (A minimal inference sketch follows this entry.)

  • Updated Apr 17, 2024
  • Rust
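To illustrate the kind of on-device ONNX inference described above, here is a minimal sketch in Rust using the `tract-onnx` crate. This is not EdgeInfer's own API; the model path `model.onnx` and the `[1, 3, 224, 224]` input shape are placeholder assumptions.

```rust
use tract_onnx::prelude::*;

fn main() -> TractResult<()> {
    // Load an ONNX model from disk ("model.onnx" is a placeholder path),
    // declare a fixed input shape, optimize, and make it runnable.
    let model = tract_onnx::onnx()
        .model_for_path("model.onnx")?
        .with_input_fact(0, f32::fact([1, 3, 224, 224]).into())?
        .into_optimized()?
        .into_runnable()?;

    // Feed a dummy all-zero tensor just to exercise the pipeline;
    // a real application would fill this from a sensor or camera frame.
    let input: Tensor = tract_ndarray::Array4::<f32>::zeros((1, 3, 224, 224)).into();
    let outputs = model.run(tvec!(input.into()))?;

    // Inspect the first output tensor's shape.
    println!("output shape: {:?}", outputs[0].shape());
    Ok(())
}
```

Because `tract` is pure Rust with no runtime dependencies, the same code can be cross-compiled for Android, iOS, or bare-metal MCU targets, which is the deployment scenario the EdgeInfer description targets.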
