
# Argoproj - Get stuff done with Kubernetes


## Quickstart

```sh
kubectl create namespace argo
kubectl apply -n argo -f https://raw.githubusercontent.com/argoproj/argo/stable/manifests/install.yaml
```

## What is Argoproj?

Argoproj is a collection of tools for getting work done with Kubernetes.

- Argo Workflows - Container-native Workflow Engine
- Argo CD - Declarative GitOps Continuous Delivery
- Argo Events - Event-based Dependency Manager
- Argo Rollouts - Deployment CR with support for Canary and Blue Green deployment strategies

Additionally, argoproj-labs is a separate GitHub org that we set up for community contributions related to the Argoproj ecosystem. Repos in argoproj-labs are administered by the owners of each project. Please reach out to us on the Argo Slack channel if you have a project that you would like to add to the org, to make it easier for others in the Argo community to find, use, and contribute back.

## What is Argo Workflows?

Argo Workflows is an open source container-native workflow engine for orchestrating parallel jobs on Kubernetes. Argo Workflows is implemented as a Kubernetes CRD (Custom Resource Definition).

- Define workflows where each step in the workflow is a container.
- Model multi-step workflows as a sequence of tasks, or capture the dependencies between tasks using a directed acyclic graph (DAG).
- Easily run compute-intensive jobs for machine learning or data processing in a fraction of the time using Argo Workflows on Kubernetes.
- Run CI/CD pipelines natively on Kubernetes without configuring complex software development products.
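Because a Workflow is just a Kubernetes custom resource, the simplest case is a manifest with a single container step. The sketch below follows the upstream hello-world example (the `whalesay` template name and `docker/whalesay` image are illustrative):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-world-   # the controller appends a random suffix
spec:
  entrypoint: whalesay         # the template to run first
  templates:
  - name: whalesay
    container:                 # each step is a container
      image: docker/whalesay
      command: [cowsay]
      args: ["hello world"]
```

A manifest like this can be submitted with `kubectl create -n argo -f` like any other Kubernetes resource, or with the `argo` CLI.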

## Why Argo Workflows?

- Designed from the ground up for containers, without the overhead and limitations of legacy VM and server-based environments.
- Cloud agnostic: runs on any Kubernetes cluster.
- Easily orchestrate highly parallel jobs on Kubernetes.
- Argo Workflows puts a cloud-scale supercomputer at your fingertips!

## Documentation

## Features

- DAG- or steps-based declaration of workflows
- Artifact support (S3, Artifactory, HTTP, Git, raw)
- Step-level inputs & outputs (artifacts/parameters)
- Loops
- Parameterization
- Conditionals
- Timeouts (step & workflow level)
- Retry (step & workflow level)
- Resubmit (memoized)
- Suspend & resume
- Cancellation
- K8s resource orchestration
- Exit hooks (notifications, cleanup)
- Garbage collection of completed workflows
- Scheduling (affinity/tolerations/node selectors)
- Volumes (ephemeral/existing)
- Parallelism limits
- Daemoned steps
- DinD (Docker-in-Docker)
- Script steps
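Several of these features compose in a single manifest. The sketch below, modeled on the upstream dag-diamond example, combines DAG-based declaration, parameterization, and step-level inputs: one parameterized `echo` template is reused by four tasks whose `dependencies` fields form a diamond-shaped graph (names and image are illustrative):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: dag-diamond-
spec:
  entrypoint: diamond
  templates:
  - name: echo                  # reusable, parameterized step
    inputs:
      parameters:
      - name: message
    container:
      image: alpine:3.7
      command: [echo, "{{inputs.parameters.message}}"]
  - name: diamond
    dag:
      tasks:                    # B and C run in parallel after A; D waits for both
      - name: A
        template: echo
        arguments:
          parameters: [{name: message, value: A}]
      - name: B
        dependencies: [A]
        template: echo
        arguments:
          parameters: [{name: message, value: B}]
      - name: C
        dependencies: [A]
        template: echo
        arguments:
          parameters: [{name: message, value: C}]
      - name: D
        dependencies: [B, C]
        template: echo
        arguments:
          parameters: [{name: message, value: D}]
```

The controller schedules each task as soon as its dependencies complete, so B and C execute concurrently without any explicit parallelism configuration.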

## Who uses Argo?

As the Argo Community grows, we'd like to keep track of our users. Please send a PR with your organization name.

Currently officially using Argo:

  1. Adevinta
  2. Admiralty
  3. Adobe
  4. Alibaba Cloud
  5. Ant Financial
  6. BasisAI
  7. BioBox Analytics
  8. BlackRock
  9. Canva
  10. Capital One
  11. CCRi
  12. Codec
  13. Commodus Tech
  14. CoreFiling
  15. Cratejoy
  16. CyberAgent
  17. Cyrus Biotechnology
  18. Datadog
  19. DataStax
  20. EBSCO Information Services
  21. Equinor
  22. Fairwinds
  23. Gardener
  24. GitHub
  25. Gladly
  26. Google
  27. Greenhouse
  28. HOVER
  29. IBM
  30. InsideBoard
  31. Interline Technologies
  32. Intuit
  33. Karius
  34. KintoHub
  35. Localytics
  36. Maersk
  37. Max Kelsen
  38. Mirantis
  39. NVIDIA
  40. OVH
  41. Peak AI
  42. Preferred Networks
  43. Quantibio
  44. Ramboll Shair
  45. Red Hat
  46. SAP Fieldglass
  47. SAP Hybris
  48. Sidecar Technologies
  49. Styra
  50. Threekit
  51. Tiger Analytics
  52. Wavefront

## Community Blogs and Presentations

## Project Resources