Merge remote-tracking branch 'upstream/main' into apidoc_improvements
* upstream/main:
  Updated environment lockfiles (SciTools#5270)
  Drop python3.8 support (SciTools#5269)
  build wheel from sdist, not src (SciTools#5266)
  Lazy netcdf saves (SciTools#5191)
  move setup.cfg to pyproject.toml (SciTools#5262)
  Support Python 3.11 (SciTools#5226)
  Remove Resolve test workaround (SciTools#5267)
  add missing whatsnew entry (SciTools#5265)
tkknight committed Apr 22, 2023
2 parents f7987eb + 6cecf04 commit 79d93b3
Showing 54 changed files with 1,880 additions and 580 deletions.
49 changes: 49 additions & 0 deletions .flake8
@@ -0,0 +1,49 @@
[flake8]
# References:
# https://flake8.readthedocs.io/en/latest/user/configuration.html
# https://flake8.readthedocs.io/en/latest/user/error-codes.html
# https://pycodestyle.readthedocs.io/en/latest/intro.html#error-codes

max-line-length = 80
max-complexity = 50
select = C,E,F,W,B,B950
ignore =
# E203: whitespace before ':'
E203,
# E226: missing whitespace around arithmetic operator
E226,
# E231: missing whitespace after ',', ';', or ':'
E231,
# E402: module level imports on one line
E402,
# E501: line too long
E501,
# E731: do not assign a lambda expression, use a def
E731,
# W503: line break before binary operator
W503,
# W504: line break after binary operator
W504,
exclude =
#
# ignore the following directories
#
.eggs,
build,
docs/src/sphinxext/*,
tools/*,
benchmarks/*,
#
# ignore auto-generated files
#
_ff_cross_refrences.py,
std_names.py,
um_cf_map.py,
#
# ignore third-party files
#
gitwash_dumper.py,
#
# convenience imports
#
lib/iris/common/__init__.py
8 changes: 4 additions & 4 deletions .github/workflows/ci-tests.yml
@@ -35,18 +35,18 @@ jobs:
fail-fast: false
matrix:
os: ["ubuntu-latest"]
python-version: ["3.10"]
python-version: ["3.11"]
session: ["doctest", "gallery", "linkcheck"]
include:
- os: "ubuntu-latest"
python-version: "3.10"
python-version: "3.11"
session: "tests"
coverage: "--coverage"
- os: "ubuntu-latest"
python-version: "3.9"
python-version: "3.10"
session: "tests"
- os: "ubuntu-latest"
python-version: "3.8"
python-version: "3.9"
session: "tests"

env:
6 changes: 2 additions & 4 deletions .github/workflows/ci-wheels.yml
@@ -35,9 +35,7 @@ jobs:
- name: "building"
shell: bash
run: |
# require build with explicit --sdist and --wheel in order to
# get correct version associated with sdist and bdist artifacts
pipx run build --sdist --wheel
pipx run build
- uses: actions/upload-artifact@v3
with:
@@ -54,7 +52,7 @@
strategy:
fail-fast: false
matrix:
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.9", "3.10", "3.11"]
session: ["wheel"]
env:
ENV_NAME: "ci-wheels"
1 change: 0 additions & 1 deletion .pre-commit-config.yaml
@@ -47,7 +47,6 @@ repos:
hooks:
- id: flake8
types: [file, python]
args: [--config=./setup.cfg]

- repo: https://github.com/pycqa/isort
rev: 5.12.0
4 changes: 3 additions & 1 deletion MANIFEST.in
@@ -4,7 +4,9 @@ prune docs
prune etc
recursive-include lib *.cdl *.cml *.json *.md *.py *.template *.txt *.xml
prune requirements
recursive-include requirements *.txt
prune tools
exclude .flake8
exclude .git-blame-ignore-revs
exclude .git_archival.txt
exclude .gitattributes
@@ -20,8 +22,8 @@ exclude Makefile
exclude noxfile.py

# files required to build iris.std_names module
include tools/generate_std_names.py
include etc/cf-standard-name-table.xml
include tools/generate_std_names.py

global-exclude *.py[cod]
global-exclude __pycache__
2 changes: 1 addition & 1 deletion benchmarks/asv.conf.json
@@ -19,7 +19,7 @@
// * No build-time environment variables.
// * Is run in the same environment as the ASV install itself.
"delegated_env_commands": [
"PY_VER=3.10 nox --envdir={conf_dir}/.asv/env/nox01 --session=tests --install-only --no-error-on-external-run --verbose"
"PY_VER=3.11 nox --envdir={conf_dir}/.asv/env/nox01 --session=tests --install-only --no-error-on-external-run --verbose"
],
// The parent directory of the above environment.
// The most recently modified environment in the directory will be used.
2 changes: 1 addition & 1 deletion benchmarks/bm_runner.py
@@ -60,7 +60,7 @@ def _prep_data_gen_env() -> None:
"""

root_dir = BENCHMARKS_DIR.parent
python_version = "3.10"
python_version = "3.11"
data_gen_var = "DATA_GEN_PYTHON"
if data_gen_var in environ:
print("Using existing data generation environment.")
42 changes: 40 additions & 2 deletions docs/src/userguide/real_and_lazy_data.rst
@@ -6,6 +6,7 @@

import dask.array as da
import iris
from iris.cube import CubeList
import numpy as np


@@ -227,10 +228,47 @@ coordinates' lazy points and bounds:
Dask Processing Options
-----------------------

Iris uses dask to provide lazy data arrays for both Iris cubes and coordinates,
and for computing deferred operations on lazy arrays.
Iris uses `Dask <https://docs.dask.org/en/stable/>`_ to provide lazy data arrays for
both Iris cubes and coordinates, and for computing deferred operations on lazy arrays.

Dask provides processing options to control how deferred operations on lazy arrays
are computed. This is provided via the ``dask.set_options`` interface. See the
`dask documentation <http:https://dask.pydata.org/en/latest/scheduler-overview.html>`_
for more information on setting dask processing options.
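
For example, here is a minimal sketch of setting a scheduler option (note: this uses
the current ``dask.config.set`` interface, which superseded ``dask.set_options`` in
later Dask versions; the filename is illustrative):

import dask
import iris

cube = iris.load_cube("input.nc")  # lazily loaded; path is illustrative

# Use the single-threaded ("synchronous") scheduler for this computation only,
# e.g. to simplify debugging of deferred operations.
with dask.config.set(scheduler="synchronous"):
    data = cube.data  # touching .data realises the lazy array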


.. _delayed_netcdf_save:

Delayed NetCDF Saving
---------------------

When saving data to NetCDF files, it is possible to *delay* writing lazy content to the
output file, to be performed by `Dask <https://docs.dask.org/en/stable/>`_ later,
thus enabling parallel save operations.

This works in the following way:
1. an :func:`iris.save` call is made, with a NetCDF file output and the additional
keyword ``compute=False``.
This is currently *only* available when saving to NetCDF, so it is documented in
the Iris NetCDF file format API. See: :func:`iris.fileformats.netcdf.save`.

2. the call creates the output file, but does not fill in variables' data, where
the data is a lazy array in the Iris object. Instead, these variables are
initially created "empty".

3. the :func:`iris.save` call returns a ``result`` which is a
:class:`~dask.delayed.Delayed` object.

4. the save can be completed later by calling ``result.compute()``, or by passing it
to the :func:`dask.compute` call.

The benefit of this is that costly data transfer operations can be performed in
parallel with writes to other data files. Also, where array contents are calculated
from shared lazy input data, these can be computed in parallel efficiently by Dask
(i.e. without re-fetching), similar to what :meth:`iris.cube.CubeList.realise_data`
can do.

.. note::
This feature does **not** enable parallel writes to the *same* NetCDF output file.
That can only be done on certain operating systems, with a specially configured
build of the NetCDF C library, and is not supported by Iris at present.
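
As a rough end-to-end sketch of the workflow described above (the input cube and
filenames are illustrative):

import dask
import iris

cube = iris.load_cube("input.nc")  # lazily loaded; path is illustrative

# Create the output file, but defer writing the lazy data payload.
result = iris.save(cube, "delayed_output.nc", compute=False)

# ... other work can happen here, e.g. queuing further delayed saves ...

# Complete the save: either of the following triggers the deferred writes.
result.compute()
# or equivalently: dask.compute(result)
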
31 changes: 28 additions & 3 deletions docs/src/whatsnew/latest.rst
@@ -30,7 +30,10 @@ This document explains the changes made to Iris for this release
✨ Features
===========

#. N/A
#. `@pp-mo`_ and `@lbdreyer`_ supported delayed saving of lazy data, when writing to
the netCDF file format. See: :ref:`delayed netCDF saves <delayed_netcdf_save>`.
Also with significant input from `@fnattino`_.
(:pull:`5191`)


🐛 Bugs Fixed
@@ -60,7 +63,11 @@ This document explains the changes made to Iris for this release
🔗 Dependencies
===============

#. N/A
#. `@rcomer`_ and `@bjlittle`_ (reviewer) added testing support for python
3.11. (:pull:`5226`)

#. `@rcomer`_ dropped support for python 3.8, in accordance with the NEP29_
recommendations (:pull:`5226`)


📚 Documentation
@@ -84,16 +91,34 @@ This document explains the changes made to Iris for this release
#. `@bjlittle`_ added the `codespell`_ `pre-commit`_ ``git-hook`` to automate
spell checking within the code-base. (:pull:`5186`)

#. `@bjlittle`_ and `@trexfeathers`_ (reviewer) added a `check-manifest`_
GitHub Action and `pre-commit`_ ``git-hook`` to automate verification
of assets bundled within a ``sdist`` and binary ``wheel`` of our
`scitools-iris`_ PyPI package. (:pull:`5259`)

#. `@rcomer`_ removed a now redundant copying workaround from Resolve testing.
(:pull:`5267`)

#. `@bjlittle`_ and `@trexfeathers`_ (reviewer) migrated ``setup.cfg`` to
``pyproject.toml``, as motivated by `PEP-0621`_. (:pull:`5262`)

#. `@bjlittle`_ adopted `pypa/build`_ recommended best practice to build a
binary ``wheel`` from the ``sdist``. (:pull:`5266`)


.. comment
Whatsnew author names (@github name) in alphabetical order. Note that,
core dev names are automatically included by the common_links.inc:
.. _@fnattino: https://github.com/fnattino


.. comment
Whatsnew resources in alphabetical order:
.. _sphinx-panels: https://github.com/executablebooks/sphinx-panels
.. _sphinx-design: https://github.com/executablebooks/sphinx-design
.. _check-manifest: https://github.com/mgedmin/check-manifest
.. _PEP-0621: https://peps.python.org/pep-0621/
.. _pypa/build: https://pypa-build.readthedocs.io/en/stable/
.. _NEP29: https://numpy.org/neps/nep-0029-deprecation_policy.html