
Argoproj - Get stuff done with Kubernetes


Quickstart

kubectl create namespace argo
kubectl apply -n argo -f https://raw.githubusercontent.com/argoproj/argo/stable/manifests/install.yaml

What is Argoproj?

Argoproj is a collection of tools for getting work done with Kubernetes.

  • Argo Workflows - Container-native Workflow Engine
  • Argo CD - Declarative GitOps Continuous Delivery
  • Argo Events - Event-based Dependency Manager
  • Argo Rollouts - Deployment CR with support for Canary and Blue Green deployment strategies

Also, argoproj-labs is a separate GitHub org that we set up for community contributions related to the Argoproj ecosystem. Repos in argoproj-labs are administered by the owners of each project. Please reach out to us on the Argo Slack channel if you have a project that you would like to add to the org, to make it easier for others in the Argo community to find, use, and contribute back to.

What is Argo Workflows?

Argo Workflows is an open source container-native workflow engine for orchestrating parallel jobs on Kubernetes. Argo Workflows is implemented as a Kubernetes CRD (Custom Resource Definition).

  • Define workflows where each step in the workflow is a container.
  • Model multi-step workflows as a sequence of tasks or capture the dependencies between tasks using a graph (DAG).
  • Easily run compute-intensive jobs for machine learning or data processing in a fraction of the time using Argo Workflows on Kubernetes.
  • Run CI/CD pipelines natively on Kubernetes without configuring complex software development products.
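As a concrete illustration of the points above, here is a minimal Workflow manifest along the lines of the hello-world example in the Argo documentation, where the single step is just a container:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-world-    # the controller appends a random suffix
spec:
  entrypoint: whalesay          # the template to run first
  templates:
  - name: whalesay
    container:
      image: docker/whalesay
      command: [cowsay]
      args: ["hello world"]
```

Submitting this manifest (for example with kubectl create, or with the argo CLI) creates a Workflow custom resource, and the controller schedules the container as a pod on the cluster.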

Why Argo Workflows?

  • Designed from the ground up for containers without the overhead and limitations of legacy VM and server-based environments.
  • Cloud-agnostic: runs on any Kubernetes cluster.
  • Easily orchestrate highly parallel jobs on Kubernetes.
  • Argo Workflows puts a cloud-scale supercomputer at your fingertips!

Documentation

Features

  • DAG or Steps based declaration of workflows
  • Artifact support (S3, Artifactory, HTTP, Git, raw)
  • Step-level inputs & outputs (artifacts/parameters)
  • Loops
  • Parameterization
  • Conditionals
  • Timeouts (step & workflow level)
  • Retry (step & workflow level)
  • Resubmit (memoized)
  • Suspend & Resume
  • Cancellation
  • K8s resource orchestration
  • Exit Hooks (notifications, cleanup)
  • Garbage collection of completed workflows
  • Scheduling (affinity/tolerations/node selectors)
  • Volumes (ephemeral/existing)
  • Parallelism limits
  • Daemoned steps
  • DinD (docker-in-docker)
  • Script steps
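Several of the features above (DAG-based declaration, parameterization, and step-level inputs) show up together in a single manifest. The following is an illustrative sketch in the style of the DAG examples from the Argo documentation; the task names, image, and parameter values are hypothetical:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: dag-example-
spec:
  entrypoint: main
  templates:
  - name: main
    dag:
      tasks:
      - name: A                  # runs first
        template: echo
        arguments:
          parameters: [{name: message, value: A}]
      - name: B
        dependencies: [A]        # B waits for A to succeed
        template: echo
        arguments:
          parameters: [{name: message, value: B}]
  - name: echo                   # a reusable, parameterized step
    inputs:
      parameters:
      - name: message
    container:
      image: alpine:3.7
      command: [echo, "{{inputs.parameters.message}}"]
```

The dag template expresses dependencies explicitly; the same two tasks could instead be written as a steps template if a simple sequence is all that is needed.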

Who uses Argo?

As the Argo Community grows, we'd like to keep track of our users. Please send a PR with your organization name.

Currently officially using Argo:

  1. Adevinta
  2. Admiralty
  3. Adobe
  4. Alibaba Cloud
  5. Ant Financial
  6. BioBox Analytics
  7. BlackRock
  8. Canva
  9. Capital One
  10. CCRi
  11. Codec
  12. Commodus Tech
  13. CoreFiling
  14. Cratejoy
  15. CyberAgent
  16. Cyrus Biotechnology
  17. Datadog
  18. DataStax
  19. EBSCO Information Services
  20. Equinor
  21. Fairwinds
  22. Gardener
  23. GitHub
  24. Gladly
  25. Google
  26. Greenhouse
  27. HOVER
  28. IBM
  29. InsideBoard
  30. Interline Technologies
  31. Intuit
  32. Karius
  33. KintoHub
  34. Localytics
  35. Maersk
  36. Max Kelsen
  37. Mirantis
  38. NVIDIA
  39. OVH
  40. Peak AI
  41. Preferred Networks
  42. Quantibio
  43. Red Hat
  44. SAP Fieldglass
  45. SAP Hybris
  46. Sidecar Technologies
  47. Styra
  48. Threekit
  49. Tiger Analytics
  50. Wavefront

Community Blogs and Presentations

Project Resources