Continuous Integration tests up and running (#116)
* CI updated for GitHub Actions

* Update Project.toml

* CI updated with yml file

* yml file path updated

* yaml file extension fixed

* PYTHON variable set to the output of `which python`

* PYTHON env fixed

* CI updated

* [WIP] Investigating strange behaviour of SIA solver

There is some outflow outside the glacier, potentially due to the staggered grid or some processing step that brings ice outside the main glacier catchment. We need to investigate the SIA code in detail and play with the tolerance.
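
A minimal sketch of the kind of masking that could prevent this (hypothetical code, not the actual ODINN solver; `H` and `glacier_mask` are made-up names):

```julia
# Hypothetical illustration: after a solver step, clamp negative thickness and
# zero out cells outside the glacier catchment so no ice can appear outside the
# original glacier outline.
function enforce_glacier_domain!(H::AbstractMatrix, glacier_mask::AbstractMatrix{Bool})
    @assert size(H) == size(glacier_mask)
    for i in eachindex(H, glacier_mask)
        H[i] = glacier_mask[i] ? max(H[i], zero(H[i])) : zero(H[i])
    end
    return H
end

H = rand(5, 5)                              # toy ice thickness field
mask = trues(5, 5); mask[:, end] .= false   # toy catchment mask
enforce_glacier_domain!(H, mask)
```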

* solve problem with boundary condition for SIA PDE (no UDE)

* restore old version of action

* [WIP] Mass balance fixed.

Fixed an issue with the MB model, which was not correctly selecting the subperiods of the climate data. To be merged with Facu's fix of the SIA boundary conditions.
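
A minimal sketch of the kind of subperiod selection this refers to (hypothetical variable names; the actual ODINN climate handling differs):

```julia
using Dates

# Take only the climate samples falling inside one sub-period instead of
# reusing the full series.
t_clim = collect(Date(2010, 1, 1):Month(1):Date(2012, 12, 1))  # monthly time axis
temp   = randn(length(t_clim))                                  # toy temperature series

sub_start, sub_end = Date(2010, 10, 1), Date(2011, 9, 30)
idx = findall(d -> sub_start <= d <= sub_end, t_clim)
temp_sub = temp[idx]                                            # forcing for the sub-period only
```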

* New CI and Python environment from Facu

* Updated test reference files

* Forcing initialization of gdirs from scratch for CI

* CI and environment updated

* CI and environment updated

* Python env and tests fixed

* Update environment.yml

Fixing environment name.

* Update environment.yml

* Update environment.yml

* Update environment.yml

* Update environment.yml

* Update CI.yml

* Update CI.yml

* Test update and environment.yml

* Update CI.yml

* Update CI.yml with CA certificate setup

* CA certificate

* Update CA variable assignment

* Update CI.yml

* Update CI.yml

* Update CI.yml

* Update CI.yml

* Update CI.yml

* SSL certificate added to tests

* Ref files for test updated

* Docs action removed + tests with fewer glaciers

For now I have deactivated the Documentation action on GitHub; we can provide docs in the future once the API is stable. I have also reduced the number of glaciers in the tests from 12 to 5, since the test suite downloads a ton of data on GitHub CI.

* Using 2 workers for tests

* Using 2 workers for tests (now for OGGM)

* Update README with micromamba installation and other details

* Multiprocessing fallback in CI for PDE solving (see the worker sketch after this list)

* Multiprocessing for OGGM fixed

* Avoid downloading Millan22 velocities in CI

* Correctly bypassing download of Millan22 velocities

* Including missing catch on get_initial_status()
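
A minimal Distributed.jl sketch of the two-worker setup mentioned in the multiprocessing items above (the real ODINN/OGGM multiprocessing setup differs; names are made up):

```julia
using Distributed

# Start two workers, mirroring the CI configuration, and spread a toy
# per-glacier workload across them.
addprocs(2; exeflags = "--project")

@everywhere toy_solve(i) = sum(abs2, randn(100)) * i   # stand-in for a per-glacier PDE solve

results = pmap(toy_solve, 1:5)
rmprocs(workers())
```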

---------

Co-authored-by: Facundo Sapienza <[email protected]>
JordiBolibar and facusapienza21 committed Jun 9, 2023
1 parent f46a2b3 commit 5564e48
Showing 21 changed files with 566 additions and 232 deletions.
101 changes: 85 additions & 16 deletions .github/workflows/CI.yml
@@ -1,28 +1,97 @@
name: Run tests

name: Run Tests
on:
push:
branches:
- main
tags: ['*']
pull_request:
types: [opened, synchronize, reopened]
schedule:
- cron: '0 0 * * 0'

concurrency:
# Skip intermediate builds: always.
# Cancel intermediate builds: only if it is a pull request build.
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ startsWith(github.ref, 'refs/pull/') }}
jobs:
test-github-cpuonly:
env:
test:
name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - ${{ github.event_name }}
runs-on: ${{ matrix.os }}
defaults:
run:
shell: bash -el {0}
strategy:
fail-fast: false
matrix:
os: [ubuntu-20.04]
julia-version: ['1.8']
julia-arch: [x64]

version:
- '1.9'
python: [3.9]
os:
- ubuntu-latest
arch:
- x64
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@latest
- name: Set up Python 🐍 ${{ matrix.python }}
uses: actions/setup-python@v2
with:
version: ${{ matrix.julia-version }}
- uses: julia-actions/julia-buildpkg@latest
- uses: julia-actions/julia-runtest@latest
python-version: ${{ matrix.python }}
- name: Create environment with micromamba 🐍🖤
uses: mamba-org/setup-micromamba@v1
with:
micromamba-version: '1.3.1-0'
environment-file: ./environment.yml
environment-name: oggm_env # it is recommended to specify both the name and the yml file.
init-shell: bash
cache-environment: true
# condarc-file: ./condarc.yml # If necessary, we can include .condarc to configure environment
- name: Test creation of environment with micromamba 🔧🐍🖤
run: |
which python
conda env export
shell: bash -el {0}
- name: Update certifi
run: |
pip install --upgrade certifi
shell: bash -el {0}
# - name: Test OGGM installation 🔧🌎
# run: pytest.oggm
# shell: bash -el {0}
- name: Set ENV Variables for PyCall.jl 🐍 📞
run: export PYTHON=/home/runner/micromamba/envs/oggm_env/bin/python
shell: bash -el {0}
- uses: julia-actions/setup-julia@v1
with:
version: ${{ matrix.version }}
arch: ${{ matrix.arch }}
- uses: julia-actions/cache@v1
with:
cache-registries: "true"
- uses: julia-actions/julia-buildpkg@v1
env:
PYTHON : /home/runner/micromamba/envs/oggm_env/bin/python
- uses: julia-actions/julia-runtest@v1
- uses: julia-actions/julia-processcoverage@v1
- uses: codecov/codecov-action@v2
with:
files: lcov.info
# docs:
# name: Documentation
# runs-on: ubuntu-latest
# permissions:
# contents: write
# statuses: write
# steps:
# - uses: actions/checkout@v2
# - uses: julia-actions/setup-julia@v1
# with:
# version: '1.9'
# - uses: julia-actions/julia-buildpkg@v1
# env:
# PYTHON : /home/runner/micromamba/envs/oggm_env/bin/python
# - uses: julia-actions/julia-docdeploy@v1
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# - run: |
# julia --project=docs -e '
# using Documenter: DocMeta, doctest
# using ODINN
# DocMeta.setdocmeta!(ODINN, :DocTestSetup, :(using ODINN); recursive=true)
# doctest(ODINN)'
16 changes: 16 additions & 0 deletions .github/workflows/CompatHelper.yml
@@ -0,0 +1,16 @@
name: CompatHelper
on:
schedule:
- cron: 0 0 * * *
workflow_dispatch:
jobs:
CompatHelper:
runs-on: ubuntu-latest
steps:
- name: Pkg.add("CompatHelper")
run: julia -e 'using Pkg; Pkg.add("CompatHelper")'
- name: CompatHelper.main()
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
COMPATHELPER_PRIV: ${{ secrets.DOCUMENTER_KEY }}
run: julia -e 'using CompatHelper; CompatHelper.main()'
2 changes: 1 addition & 1 deletion Project.toml
@@ -70,7 +70,7 @@ SnoopPrecompile = "1.0.3"
TimerOutputs = "0.5.22"
Tullio = "0.3"
Zygote = "0.6"
julia = "1.8"
julia = "1.7"

[extras]
CPUSummary = "2a0fbf3d-bb9c-48f3-b0a9-814d99fd7ab9"
28 changes: 7 additions & 21 deletions README.md
@@ -1,12 +1,7 @@
# ODINN

<!---
[![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://JordiBolibar.github.io/ODINN.jl/stable)
[![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://JordiBolibar.github.io/ODINN.jl/dev)
[![Build Status](https://github.com/JordiBolibar/ODINN.jl/actions/workflows/CI.yml/badge.svg?branch=main)](https://github.com/JordiBolibar/ODINN.jl/actions/workflows/CI.yml?query=branch%3Amain)
[![Build Status](https://travis-ci.com/JordiBolibar/ODINN.jl.svg?branch=main)](https://travis-ci.com/JordiBolibar/ODINN.jl)
[![Coverage](https://codecov.io/gh/JordiBolibar/ODINN.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/JordiBolibar/ODINN.jl)
-->

<img src="https://github.com/ODINN-SciML/odinn_toy/blob/main/plots/ODINN_logo_final.png" width="250">

@@ -25,21 +20,22 @@
In order to install `ODINN` in a given environment, just do in the REPL:
```julia
julia> ] # enter Pkg mode
(@v1.8) pkg> activate MyEnvironment # or activate whatever path for the Julia environment
(@v1.9) pkg> activate MyEnvironment # or activate whatever path for the Julia environment
(MyEnvironment) pkg> add ODINN
```

## ODINN initialization: integration with OGGM and multiprocessing

In order to call OGGM in Python from Julia, a Python installation is needed, which then can be used in Julia using [PyCall](https://github.com/JuliaPy/PyCall.jl). We recommend splitting the Julia (i.e. ODINN) and Python (i.e. OGGM) files in separate folders, which we chose to name `Julia` and `Python`, both placed at root level. As indicated in the [OGGM documentation](https://docs.oggm.org/en/stable/installing-oggm.html), when installing OGGM it is best to create a new dedicated conda environment for it (e.g. `oggm_env`). In the same environment, install also the [OGGM Mass-Balance sandbox](https://github.com/OGGM/massbalance-sandbox) following the instructions in the repository.

The path to this conda environment needs to be specified in the `ENV["PYTHON"]` variable in Julia, for PyCall to find it. This configuration is very easy to implement, it just requires activating the conda environment before the first time you run ODINN in your machine. In the terminal (not in a Julia session), run:
ODINN depends on some Python packages, mainly OGGM and xarray. To make installing the necessary Python dependencies easy, we provide a Python environment (`oggm_env`) in `environment.yml`. To install and activate it, we recommend using micromamba:

```
conda activate oggm_env # replace `oggm_env` with whatever conda environment where you have installed OGGM and the MBSandbox
micromamba create -f environment.yml
micromamba activate oggm_env
```

Then, you need to configure PyCall to use the Python path for that conda environment:
In order to call OGGM in Python from Julia, we use [PyCall.jl](https://github.com/JuliaPy/PyCall.jl). PyCall hooks into the Python installation and allows using Python seamlessly from Julia.

The path to this conda environment needs to be specified in the `ENV["PYTHON"]` variable in Julia for PyCall to find it. This configuration is very easy to implement: it just requires providing the Python path to PyCall and building it:

```julia
julia # start Julia session
Expand All @@ -65,16 +61,6 @@ From this point, it is possible to use ODINN with multiprocessing and to run Pyt

ODINN works as a back-end of OGGM, utilizing all its tools to retrieve RGI data, topographical data, climate data and other datasets from the OGGM shop. We use these data to specify the initial state of the simulations, and to retrieve the climate data to force the model. Everything related to the mass balance and ice flow dynamics models is written 100% in Julia. This allows us to run tests with this toy model for any glacier on Earth. In order to choose a glacier, you just need to specify the RGI ID, which you can find [here](https://www.glims.org/maps/glims).
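
As a side note to the PyCall configuration described a few paragraphs above (the corresponding code block in the diff is cut off by the collapsed hunk), a minimal sketch of pointing PyCall at the `oggm_env` Python could look as follows; the path is a placeholder:

```julia
using Pkg

# Point PyCall at the python binary of the oggm_env environment and rebuild it.
# Restart Julia afterwards so the new configuration is picked up.
ENV["PYTHON"] = "/path/to/micromamba/envs/oggm_env/bin/python"
Pkg.build("PyCall")

# In a fresh session:
# using PyCall
# PyCall.python   # should now point at the path set above
```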

## Running the toy model

A demonstration with a toy model is showcased in [`src/scripts/toy_model.jl`](https://github.com/ODINN-SciML/ODINN.jl/blob/main/scripts/toy_model.jl). The `Project.toml` includes all the required dependencies. If you are running this code from scratch, you may need to install the libraries using `Pkg.instantiate()`. In case you want to add this package to the project manifest, you can also use `Pkg.resolve()` before instantiating the project. You can replace the preamble in `src/scripts/toy_model.jl` with

```julia
import Pkg
Pkg.activate(dirname(Base.current_project()))
Pkg.instantiate()
Pkg.precompile()
```
## Upcoming changes

A stable API is still being designed and will be available in the next release. If you plan to start using the model, please contact us, although we recommend waiting until the next release for a smoother experience.
41 changes: 41 additions & 0 deletions environment.yml
@@ -0,0 +1,41 @@
name: oggm_env
channels:
- conda-forge
dependencies:
- python=3.9
- jupyter
- jupyterlab
- numpy
- scipy
- pandas
- shapely
- matplotlib
- Pillow
- netcdf4
- scikit-image
- scikit-learn
- configobj
- xarray
- pytest
- dask
- bottleneck
- gdal=3.3
- pyproj
- cartopy
- geopandas
- rasterio
- rioxarray
- pytables
- salem
- motionless
- ipython
- numpydoc
- seaborn
- pip
- pip:
- joblib
- progressbar2
- git+https://github.com/OGGM/pytest-mpl
- git+https://github.com/OGGM/massbalance-sandbox
- oggm==1.6.0
- certifi
100 changes: 100 additions & 0 deletions scripts/dhdt_plots.jl
@@ -0,0 +1,100 @@

# using Plots; gr()
using CairoMakie
using JLD2
import ODINN: fillZeros
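
# This script compares ice thickness fields from forward simulations run with different
# surface mass balance setups (see `plot_type` in `make_plots` below: final H, H - H₀,
# or the difference between two MB runs) and saves the resulting heatmap grids under plots/MB/.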


function make_plots()

# plot_type = "only_H" # plot final H
# plot_type = "MB_diff" # differences between runs with different MB models
plot_type = "H_diff" # H - H₀
tspan = (2010.0, 2015.0) # period in years for simulation

root_dir = dirname(Base.current_project())

# Load forward simulations with different surface MB
grefs = load(joinpath(root_dir, "data/gdir_refs_$tspan.jld2"))["gdir_refs"]
grefs_MBu1 = load(joinpath(root_dir, "data/gdir_refs_updatedMB1.jld2"))["gdir_refs"]

n=4
m=3
hms_MBdiff, MBdiffs = [], []
figMB = Figure(resolution = (900, 1100))
axsMB = [Axis(figMB[i, j]) for i in 1:n, j in 1:m]
hidedecorations!.(axsMB)
tightlimits!.(axsMB)
let label=""
for (i, ax) in enumerate(axsMB)
ax.aspect = DataAspect()
name = grefs[i]["RGI_ID"]
ax.title = name
H = reverse(grefs[i]["H"]', dims=2)
H₀ = reverse(grefs[i]["H₀"]', dims=2)
H_MBu1 = reverse(grefs_MBu1[i]["H"]', dims=2)
# H = reverse(grefs[i]["H"])
# H_MBu1 = reverse(grefs_MBu1[i]["H"])
if plot_type == "only_H"
H_plot = H
label = "Predicted H (m)"
elseif plot_type == "H_diff"
H_plot = H .- H₀
label = "H - H₀ (m)"
elseif plot_type == "MB_diff"
H_plot = H .- H_MBu1
label="Surface mass balance difference (m)"
end
push!(MBdiffs, H_plot)
push!(hms_MBdiff, CairoMakie.heatmap!(ax, fillZeros(H_plot), colormap=:inferno))
end

minMBdiff = minimum(minimum.(MBdiffs))
maxMBdiff = maximum(maximum.(MBdiffs))
foreach(hms_MBdiff) do hm
hm.colorrange = (minMBdiff, maxMBdiff)
end
Colorbar(figMB[2:3,m+1], limits=(minMBdiff/2,maxMBdiff/2), label=label, colormap=:inferno)
#Label(figH[0, :], text = "Glacier dataset", textsize = 30)
if plot_type == "only_H"
Makie.save(joinpath(root_dir, "plots/MB/H_MB_$tspan.pdf"), figMB, pt_per_unit = 1)
elseif plot_type == "H_diff"
Makie.save(joinpath(root_dir, "plots/MB/H_diff_wMB_$tspan.pdf"), figMB, pt_per_unit = 1)
elseif plot_type == "MB_diff"
Makie.save(joinpath(root_dir, "plots/MB/diffs_noMB_$tspan.pdf"), figMB, pt_per_unit = 1)
end

end # let

# hms = []
# for (gref, gref_MBu1) in zip(grefs, grefs_MBu1)
# H = reverse(gref["H"], dims=1)
# H_MBu1 = reverse(gref_MBu1["H"], dims=1)
# # H = gref["H"]
# # H_MBu1 = gref_MBu1["H"]
# push!(hms, heatmap(H .- H_MBu1,
# clims=(0.0,5.0),
# ylimits=(0, size(H)[1]),
# xlimits=(0, size(H)[2]),
# colorbar = false)
# )
# end

# h2 = scatter([0,0], [0,1], clims=(0.0,5.0),
# xlims=(1,1.1), xshowaxis=false, yshowaxis=false, label="", colorbar_title="cbar", grid=false)


# l = @layout [grid(6,5) a{0.01w}]

# # Create the combined plot with the subplots and shared colormap
# p_dhdt = plot(hms..., h2,
# size=(1800, 1200),
# layout=l,
# link=:all,
# aspect_ratio=:equal)

# savefig(p_dhdt, joinpath(root_dir, "plots/MB/dhdt_MB_1"))

end

make_plots()