
🦄️ Yatai: Model serving at scale on Kubernetes


Yatai makes it easy to deploy and operate machine learning serving workloads at scale on Kubernetes. Yatai is built upon BentoML, the unified model serving framework.

Core features:

  • Bento Registry - manage all of your team's Bentos and Models, backed by cloud blob storage (S3, MinIO)
  • Deployment Automation - deploy Bentos as auto-scaling API endpoints on Kubernetes and easily roll out new versions
  • Observability - a monitoring dashboard that helps users identify model performance issues
  • CI/CD - flexible APIs for integrating with your training and CI pipelines

(Screenshot: Yatai overview page)

See more product screenshots in the repository: deployment creation, Bento repositories, model detail, cluster components, deployment details, and activities.

Why Yatai

  • Yatai is built upon BentoML, the unified model serving framework that is high-performing and feature-rich
  • Yatai focuses on the model serving and deployment part of your MLOps stack and works well with any ML training/monitoring platform, such as AWS SageMaker or MLflow
  • Yatai is Kubernetes-native and integrates well with other cloud-native tools in the K8s ecosystem
  • Yatai is human-centric, providing an easy-to-use Web UI and APIs for ML scientists, MLOps engineers, and project managers

Getting Started

  1. Create an ML Service with BentoML following the Quickstart Guide or the sample projects in the BentoML Gallery (a minimal service sketch is shown after this list).

  2. Install Yatai locally with Minikube.

    • Prerequisites: minikube and Helm installed locally
    • Start a minikube Kubernetes cluster: minikube start --cpus 4 --memory 4096
    • Install Yatai Helm Chart:
      helm repo add yatai https://bentoml.github.io/yatai-chart
      helm repo update
      helm install yatai yatai/yatai -n yatai-system --create-namespace
    • Wait for installation to complete: helm status yatai -n yatai-system
    • Start minikube tunnel for accessing Yatai UI: minikube tunnel
    • Open the browser at https://yatai.127.0.0.1.sslip.io and log in with the default account admin:admin
  3. See the Administrator's Guide for a comprehensive overview of deploying and configuring Yatai for production use.
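
As a minimal illustration of step 1, the sketch below follows the BentoML 1.x quickstart pattern. It assumes the `bentoml` package is installed and that a scikit-learn model has already been saved to the local BentoML model store under the hypothetical tag `iris_clf:latest`; refer to the Quickstart Guide for the authoritative walkthrough.

```python
# service.py -- minimal BentoML service sketch (assumes BentoML 1.x and a
# scikit-learn model saved locally as "iris_clf:latest")
import numpy as np

import bentoml
from bentoml.io import NumpyNdarray

# Load the saved model from the local model store and wrap it in a runner
iris_clf_runner = bentoml.sklearn.get("iris_clf:latest").to_runner()

# Define the service; this is what gets built into a Bento
svc = bentoml.Service("iris_classifier", runners=[iris_clf_runner])

@svc.api(input=NumpyNdarray(), output=NumpyNdarray())
def classify(input_series: np.ndarray) -> np.ndarray:
    # Delegate inference to the runner
    return iris_clf_runner.predict.run(input_series)
```

Once the service and a `bentofile.yaml` are in place, `bentoml build` packages them into a Bento, which can then be pushed to the Yatai registry and deployed from the Yatai UI or API.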

Community

Contributing

There are many ways to contribute to the project:

  • If you have any feedback on the project, share it with the community in GitHub Discussions under the BentoML repo.
  • Report issues you're facing and give a "thumbs up" to issues and feature requests that are relevant to you.
  • Investigate bugs and review other developers' pull requests.
  • Contribute code or documentation to the project by submitting a GitHub pull request. See the development guide.

License

Elastic License 2.0 (ELv2)
