
Support job pipelining #160

Open

jawher opened this issue Apr 19, 2015 · 4 comments

jawher (Contributor) commented Apr 19, 2015

Make it possible for a job to trigger other jobs after it's finished.

  • A job can list its downstream jobs, but not the other way around
  • A downstream job is identified by its name
  • It should be possible to specify the scm revision for a downstream job
  • It should be possible to specify job parameters for a downstream job (Allow injecting values into a job #129) (see the sketch after this list)
  • For the parameter passing, it'll be much more useful if we could use dynamic values (environment variables, for example) in the yaml (Support environment variables substitution in .bazooka.yml #159)
  • A downstream job must be able to access the artifacts of the previous job
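Combining these points, a downstream entry could look like the sketch below. The params key and the ${...} substitution syntax are assumptions here, pending #129 and #159:

downstream_success:
  - project: api-it
    rev: master
    params:                              # hypothetical key, depends on #129
      API_VERSION: ${BZK_BUILD_NUMBER}   # hypothetical variable, depends on #159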

An example:

  • api is a bazooka project for a REST server written in Java
  • deploy-api deploys api to a tomcat server
  • api-perf is another project which benchmarks the performance of api
  • api-it runs integration tests on api
  • ui is an angularjs frontend
  • ui-selenium is the last project which tests the ui screens

api config:

language: java
jdk: 1.8
archive_success: target/api.war
downstream_success:
  - project: deploy-api
    rev: master
  - project: ui
    rev: master

On success, the api job triggers deploy-api and ui.

deploy-api config (which assumes the previous job's artifacts are mounted in the /incoming directory):

image: debian:jessie
script:
  - scp /incoming/api.war tomcat-admin:[email protected]:/opt/tomcat/webapps/
  - <restart tomcat>
  - <wait a bit>
downstream_success:
  - project: api-perf
    rev: master
  - project: api-it
    rev: master

etc.
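To make the chaining concrete, the ui config could follow the same pattern and trigger ui-selenium; a minimal sketch (the image and build commands are placeholders):

image: debian:jessie
script:
  - <build and test the angularjs app>
downstream_success:
  - project: ui-selenium
    rev: master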

jawher (Contributor, Author) commented Apr 19, 2015

Note that this proposed solution is a poor man's pipeline, as it doesn't support some advanced features of a real staged pipeline, such as fan-in (waiting for multiple jobs before continuing).

haklop self-assigned this Apr 19, 2015
julienvey (Member) commented

What about removing pipelines from the yaml file?

Each bazooka project would have its build configuration in the yaml file, but not the downstream/upstream dependencies. We would implement the pipelines in the Bazooka API instead.

With this solution, I think we would remove possible sources of error, such as:

  • What happens if the downstream project does not exist?
  • What if I rename it?

It would also allow us to implement more complex pipelines: fan-in, human validation...

For simplicity, we could say that each downstream job gets the artifacts generated by its upstream job in /incoming.
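Under that convention each project's yaml stays self-contained; a minimal sketch of what api-it could look like (the image and test command are placeholders):

image: debian:jessie
script:
  - cp /incoming/api.war .   # artifact produced by the upstream api job
  - <run integration tests against api.war>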

julienvey (Member) commented

A list of useful features for pipelining:

  • Non-sequential logic
    • Parallel
    • Fork/Join
    • Loops
    • Try/Catch/Finally
  • Timeout
  • Retry
  • Human interactions
  • Restartable builds

We only need to start with the sequential workflow (downstream/upstream) and create issues for the other features.

A pipeline could be described in a yaml DSL. I will try to add a simple example soon.

julienvey (Member) commented

Just an idea

entry_point:
  - name: api
    bzk_project: api-unit   # (optional) bazooka project name (if different from name)
    triggered_by: scm       # (optional) scm, manual... defaults to scm
    triggers:
      - api-perf
      - api-it
wait_for:                   # fan-in: both jobs must finish before deploy-api is triggered
  - wait_jobs:
      - api-perf
      - api-it
    triggers:
      - deploy-api
