Lazy netcdf saves #5191

Merged
merged 92 commits on Apr 21, 2023
Changes from 1 commit
Commits (92)
cd7fa42
Basic functional lazy saving.
pp-mo Oct 21, 2022
1f32800
Simplify function signature which upsets Sphinx.
pp-mo Oct 21, 2022
e0f980f
Non-lazy saves return nothing.
pp-mo Oct 21, 2022
67b96cf
Now fixed to enable use with process/distributed scheduling.
pp-mo Oct 23, 2022
8cdbc9b
Remove dask.utils.SerializableLock, which I think was a mistake.
pp-mo Mar 3, 2023
8723f24
Make DefferedSaveWrapper use _thread_safe_nc.
pp-mo Mar 8, 2023
d19a87f
Fixes for non-lazy save.
pp-mo Mar 9, 2023
45e0e60
Avoid saver error when no deferred writes.
pp-mo Mar 10, 2023
6a83200
Reorganise locking code, ready for shareable locks.
pp-mo Mar 10, 2023
78a9346
Remove optional usage of 'filelock' for lazy saves.
pp-mo Mar 10, 2023
31b12f7
Document dask-specific locking; implement differently for threads or …
pp-mo Mar 10, 2023
99b4f41
Minor fix for unit-tests.
pp-mo Mar 10, 2023
5d0a707
Pin libnetcdf to avoid problems -- see #5187.
pp-mo Mar 10, 2023
431036f
Minor test fix.
pp-mo Mar 10, 2023
dc368d9
Move DeferredSaveWrapper into _thread_safe_nc; replicate the NetCDFDa…
pp-mo Mar 13, 2023
6756a46
Update lib/iris/fileformats/netcdf/saver.py
pp-mo Mar 16, 2023
80b4b6c
Update lib/iris/fileformats/netcdf/_dask_locks.py
pp-mo Mar 16, 2023
78a8716
Update lib/iris/fileformats/netcdf/saver.py
pp-mo Mar 16, 2023
47bb08b
Small rename + reformat.
pp-mo Mar 17, 2023
0ece09a
Remove Saver lazy option; all lazy saves are delayed; factor out fill…
pp-mo Mar 18, 2023
940f544
Merge branch 'main' into lazy_save_2
pp-mo Mar 18, 2023
eb97130
Merge remote-tracking branch 'upstream/main' into lazy_save_2
pp-mo Mar 20, 2023
4596081
Repurposed 'test__FillValueMaskCheckAndStoreTarget' to 'test__data_fi…
pp-mo Mar 20, 2023
ad49fbe
Disable (temporary) saver debug printouts.
pp-mo Mar 20, 2023
b29c927
Fix test problems; Saver automatically completes to preserve existing…
pp-mo Mar 20, 2023
8f10281
Fix docstring error.
pp-mo Mar 20, 2023
6a564d9
Fix spurious error in old saver test.
pp-mo Mar 20, 2023
2fb4d6c
Fix Saver docstring.
pp-mo Mar 20, 2023
c84bfdc
More robust exit for NetCDFWriteProxy operation.
pp-mo Mar 20, 2023
5b78085
Fix doctests by making the Saver example functional.
pp-mo Mar 21, 2023
478332e
Improve docstrings; unify terminology; simplify non-lazy save call.
pp-mo Mar 23, 2023
34f154c
Moved netcdf cell-method handling into nc_load_rules.helpers, and var…
pp-mo Mar 27, 2023
d3744ba
Merge branch 'latest' into lazy_save_2
pp-mo Mar 27, 2023
9673ea0
Fix lockfiles and Makefile process.
pp-mo Mar 27, 2023
bcbcbc8
Add unit tests for routine _fillvalue_report().
pp-mo Mar 27, 2023
05c04a1
Remove debug-only code.
pp-mo Mar 27, 2023
679ea47
Added tests for what the save function does with the 'compute' keyword.
pp-mo Mar 28, 2023
70ec9dd
Fix mock-specific problems, small tidy.
pp-mo Mar 28, 2023
28a4674
Restructure hierarchy of tests.unit.fileformats.netcdf
pp-mo Mar 29, 2023
67f4b2b
Tidy test docstrings.
pp-mo Mar 29, 2023
ebec72f
Correct test import.
pp-mo Mar 29, 2023
1f5b904
Avoid incorrect checking of byte data, and a numpy deprecation warning.
pp-mo Mar 29, 2023
5045c9f
Alter parameter names to make test reports clearer.
pp-mo Mar 29, 2023
393407a
Test basic behaviour of _lazy_stream_data; make 'Saver._delayed_write…
pp-mo Mar 29, 2023
518360b
Add integration tests, and distributed dependency.
pp-mo Mar 30, 2023
5c9931f
Docstring fixes.
pp-mo Mar 31, 2023
7daee68
Documentation section and whatsnew entry.
pp-mo Apr 4, 2023
97474f9
Merge branch 'main' into lazy_save_2
pp-mo Apr 4, 2023
64c7251
Various fixes to whatsnew, docstrings and docs.
pp-mo Apr 4, 2023
75043f9
Minor review changes, fix doctest.
pp-mo Apr 11, 2023
445fbe2
Arrange tests + results to organise by package-name alone.
pp-mo Apr 11, 2023
09cb22e
Review changes.
pp-mo Apr 11, 2023
3445f58
Review changes.
pp-mo Apr 12, 2023
cb1e1f7
Enhance tests + debug.
pp-mo Apr 12, 2023
1c81cee
Support scheduler type 'single-threaded'; allow retries on delayed-sa…
pp-mo Apr 13, 2023
370837b
Improve test.
pp-mo Apr 13, 2023
2f5f3c2
Adding a whatsnew entry for 5224 (#5234)
HGWright Apr 4, 2023
a55c6f2
Replacing numpy legacy printing with array2string and remaking result…
HGWright Apr 4, 2023
4914e99
adding a whatsnew entry
HGWright Apr 4, 2023
bd642cd
configure codecov
HGWright Apr 4, 2023
bc5bdd1
remove results creation commit from blame
HGWright Apr 4, 2023
301e59e
fixing whatsnew entry
HGWright Apr 4, 2023
7b3044d
Bump scitools/workflows from 2023.04.1 to 2023.04.2 (#5236)
dependabot[bot] Apr 5, 2023
02f2b66
Use real array for data of of small netCDF variables. (#5229)
pp-mo Apr 6, 2023
a7e0689
Handle derived coordinates correctly in `concatenate` (#5096)
schlunma Apr 12, 2023
c4e8bbb
clarity on whatsnew entry contributors (#5240)
bjlittle Apr 12, 2023
e6661b8
Modernize and simplify iris.analysis._Groupby (#5015)
bouweandela Apr 12, 2023
afbdbbd
Finalises Lazy Data documentation (#5137)
ESadek-MO Apr 12, 2023
b8bb753
Fixes to _discontiguity_in_bounds (attempt 2) (#4975)
stephenworsley Apr 12, 2023
97cc149
update ci locks location (#5228)
bjlittle Apr 13, 2023
f14a321
Updated environment lockfiles (#5211)
scitools-ci[bot] Apr 13, 2023
f7a0b87
Increase retries.
pp-mo Apr 13, 2023
69ddd9d
Change debug to show which elements failed.
pp-mo Apr 13, 2023
8235d60
update cf standard units (#5244)
ESadek-MO Apr 13, 2023
724c6d2
libnetcdf <4.9 pin (#5242)
trexfeathers Apr 13, 2023
4f50dc7
Avoid possible same-file crossover between tests.
pp-mo Apr 13, 2023
0da68cf
Ensure all-different testfiles; load all vars lazy.
pp-mo Apr 13, 2023
e8b7bfd
Revert changes to testing framework.
pp-mo Apr 13, 2023
ad48caf
Remove repeated line from requirements/py*.yml (?merge error), and re…
pp-mo Apr 13, 2023
291b587
Revert some more debug changes.
pp-mo Apr 13, 2023
b2260ef
Merge branch 'latest' into lazy_save_2
pp-mo Apr 13, 2023
33a7d86
Reorganise test for better code clarity.
pp-mo Apr 14, 2023
db6932d
Use public 'Dataset.isopen()' instead of '._isopen'.
pp-mo Apr 14, 2023
631e001
Create output files in unique temporary directories.
pp-mo Apr 14, 2023
2869f97
Tests for fileformats.netcdf._dask_locks.
pp-mo Apr 14, 2023
419727b
Merge branch 'latest' into lazy_save_2
pp-mo Apr 21, 2023
2f4458b
Fix attribution names.
pp-mo Apr 21, 2023
88b7a2a
Merge branch 'latest' into lazy_save_2
pp-mo Apr 21, 2023
98a20e7
Fixed new py311 lockfile.
pp-mo Apr 21, 2023
bbc1167
Fix typos spotted by codespell.
pp-mo Apr 21, 2023
ed38e43
Add distributed test dep for python 3.11
pp-mo Apr 21, 2023
54ec0f8
Fix lockfile for python 3.11
pp-mo Apr 21, 2023
Handle derived coordinates correctly in concatenate (#5096)
* First working prototype of concatenate that handles derived coordinates correctly

* Added checks for derived coord metadata during concatenation

* Added tests

* Fixed defaults

* Added what's new entry

* Optimized test coverage
schlunma authored and pp-mo committed Apr 13, 2023
commit a7e06894d5591703a9ceb5e9599bd59b5a921a99
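
For orientation, here is a minimal sketch of the behaviour this commit fixes (illustrative only, not part of the PR diff below; the cube layout, coordinate names and values are invented for the example, assuming an Iris build with this change merged). It builds two small time-split cubes that each carry a hybrid-pressure aux factory, then concatenates them; before this change the derived "air_pressure" coordinate was dropped from the result, whereas now the factory is rebuilt on the concatenated cube.

import numpy as np
from iris.aux_factory import HybridPressureFactory
from iris.coords import AuxCoord, DimCoord
from iris.cube import Cube, CubeList


def make_cube(time_points):
    """Build a small (time, model_level) cube carrying a hybrid-pressure factory."""
    nt, nz = len(time_points), 3
    cube = Cube(np.zeros((nt, nz)), standard_name="air_temperature", units="K")
    cube.add_dim_coord(
        DimCoord(time_points, standard_name="time", units="days since 2000-01-01"), 0
    )
    cube.add_dim_coord(
        DimCoord(np.arange(nz), standard_name="model_level_number", units="1"), 1
    )
    # Dependencies of the derived "air_pressure" coordinate.
    delta = AuxCoord(np.linspace(1000.0, 100.0, nz), long_name="level_pressure", units="Pa")
    sigma = AuxCoord(np.linspace(0.9, 0.1, nz), long_name="sigma", units="1")
    surface_p = AuxCoord(np.full(nt, 100000.0), standard_name="surface_air_pressure", units="Pa")
    cube.add_aux_coord(delta, 1)
    cube.add_aux_coord(sigma, 1)
    cube.add_aux_coord(surface_p, 0)
    cube.add_aux_factory(HybridPressureFactory(delta, sigma, surface_p))
    return cube


cubes = CubeList([make_cube(np.arange(0.0, 3.0)), make_cube(np.arange(3.0, 6.0))])
(result,) = cubes.concatenate()
# With this commit, the derived coordinate survives concatenation.
print(result.coord("air_pressure").shape)  # (6, 3)

The commit also adds a check_derived_coords keyword (defaulting to True) to iris._concatenate.concatenate, documented in the diff below, which controls whether matching points and bounds of derived coordinates are enforced away from the concatenation axis.
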
3 changes: 3 additions & 0 deletions docs/src/whatsnew/latest.rst
@@ -62,6 +62,9 @@ This document explains the changes made to Iris for this release
🐛 Bugs Fixed
=============

#. `@schlunma`_ fixed :meth:`iris.cube.CubeList.concatenate` so that it
preserves derived coordinates. (:issue:`2478`, :pull:`5096`)

#. `@trexfeathers`_ and `@pp-mo`_ made Iris' use of the `netCDF4`_ library
thread-safe. (:pull:`5095`)

186 changes: 186 additions & 0 deletions lib/iris/_concatenate.py
@@ -160,6 +160,39 @@ def name(self):
return self.defn.name()


class _DerivedCoordAndDims(
namedtuple("DerivedCoordAndDims", ["coord", "dims", "aux_factory"])
):
"""
Container for a derived coordinate, the associated AuxCoordFactory, and the
associated data dimension(s) spanned over a :class:`iris.cube.Cube`.

Args:

* coord:
A :class:`iris.coords.DimCoord` or :class:`iris.coords.AuxCoord`
coordinate instance.

* dims:
A tuple of the data dimension(s) spanned by the coordinate.

* aux_factory:
A :class:`iris.aux_factory.AuxCoordFactory` instance.

"""

__slots__ = ()

def __eq__(self, other):
"""Do not take aux factories into account for equality."""
result = NotImplemented
if isinstance(other, _DerivedCoordAndDims):
equal_coords = self.coord == other.coord
equal_dims = self.dims == other.dims
result = equal_coords and equal_dims
return result


class _OtherMetaData(namedtuple("OtherMetaData", ["defn", "dims"])):
"""
Container for the metadata that defines a cell measure or ancillary
@@ -280,6 +313,7 @@ def concatenate(
check_aux_coords=True,
check_cell_measures=True,
check_ancils=True,
check_derived_coords=True,
):
"""
Concatenate the provided cubes over common existing dimensions.
@@ -296,6 +330,30 @@
If True, raise an informative
:class:`~iris.exceptions.ConcatenateError` if registration fails.

* check_aux_coords
Checks if the points and bounds of auxiliary coordinates of the cubes
match. This check is not applied to auxiliary coordinates that span the
dimension the concatenation is occurring along. Defaults to True.

* check_cell_measures
Checks if the data of cell measures of the cubes match. This check is
not applied to cell measures that span the dimension the concatenation
is occurring along. Defaults to True.

* check_ancils
Checks if the data of ancillary variables of the cubes match. This
check is not applied to ancillary variables that span the dimension the
concatenation is occurring along. Defaults to True.

* check_derived_coords
Checks if the points and bounds of derived coordinates of the cubes
match. This check is not applied to derived coordinates that span the
dimension the concatenation is occurring along. Note that differences
in scalar coordinates and dimensional coordinates used to derive the
coordinate are still checked. Checks for auxiliary coordinates used to
derive the coordinates can be ignored with `check_aux_coords`. Defaults
to True.

Returns:
A :class:`iris.cube.CubeList` of concatenated :class:`iris.cube.Cube`
instances.
@@ -321,6 +379,7 @@
check_aux_coords,
check_cell_measures,
check_ancils,
check_derived_coords,
)
if registered:
axis = proto_cube.axis
@@ -378,6 +437,8 @@ def __init__(self, cube):
self.cm_metadata = []
self.ancillary_variables_and_dims = []
self.av_metadata = []
self.derived_coords_and_dims = []
self.derived_metadata = []
self.dim_mapping = []

# Determine whether there are any anonymous cube dimensions.
@@ -437,6 +498,17 @@ def meta_key_func(dm):
av_and_dims = _CoordAndDims(av, tuple(dims))
self.ancillary_variables_and_dims.append(av_and_dims)

def name_key_func(factory):
return factory.name()

for factory in sorted(cube.aux_factories, key=name_key_func):
coord = factory.make_coord(cube.coord_dims)
dims = cube.coord_dims(coord)
metadata = _CoordMetaData(coord, dims)
self.derived_metadata.append(metadata)
coord_and_dims = _DerivedCoordAndDims(coord, tuple(dims), factory)
self.derived_coords_and_dims.append(coord_and_dims)

def _coordinate_differences(self, other, attr, reason="metadata"):
"""
Determine the names of the coordinates that differ between `self` and
@@ -544,6 +616,14 @@ def match(self, other, error_on_mismatch):
msgs.append(
msg_template.format("Ancillary variables", *differences)
)
# Check derived coordinates.
if self.derived_metadata != other.derived_metadata:
differences = self._coordinate_differences(
other, "derived_metadata"
)
msgs.append(
msg_template.format("Derived coordinates", *differences)
)
# Check scalar coordinates.
if self.scalar_coords != other.scalar_coords:
differences = self._coordinate_differences(
@@ -597,6 +677,7 @@ def __init__(self, cube_signature):
self.ancillary_variables_and_dims = (
cube_signature.ancillary_variables_and_dims
)
self.derived_coords_and_dims = cube_signature.derived_coords_and_dims
self.dim_coords = cube_signature.dim_coords
self.dim_mapping = cube_signature.dim_mapping
self.dim_extents = []
@@ -779,6 +860,11 @@ def concatenate(self):
# Concatenate the new ancillary variables
ancillary_variables_and_dims = self._build_ancillary_variables()

# Concatenate the new aux factories
aux_factories = self._build_aux_factories(
dim_coords_and_dims, aux_coords_and_dims
)

# Concatenate the new data payload.
data = self._build_data()

@@ -790,6 +876,7 @@
aux_coords_and_dims=aux_coords_and_dims,
cell_measures_and_dims=cell_measures_and_dims,
ancillary_variables_and_dims=ancillary_variables_and_dims,
aux_factories=aux_factories,
**kwargs,
)
else:
@@ -807,6 +894,7 @@ def register(
check_aux_coords=False,
check_cell_measures=False,
check_ancils=False,
check_derived_coords=False,
):
"""
Determine whether the given source-cube is suitable for concatenation
@@ -827,6 +915,31 @@
* error_on_mismatch:
If True, raise an informative error if registration fails.

* check_aux_coords
Checks if the points and bounds of auxiliary coordinates of the
cubes match. This check is not applied to auxiliary coordinates
that span the dimension the concatenation is occurring along.
Defaults to False.

* check_cell_measures
Checks if the data of cell measures of the cubes match. This check
is not applied to cell measures that span the dimension the
concatenation is occurring along. Defaults to False.

* check_ancils
Checks if the data of ancillary variables of the cubes match. This
check is not applied to ancillary variables that span the dimension
the concatenation is occurring along. Defaults to False.

* check_derived_coords
Checks if the points and bounds of derived coordinates of the cubes
match. This check is not applied to derived coordinates that span
the dimension the concatenation is occurring along. Note that
differences in scalar coordinates and dimensional coordinates used
to derive the coordinate are still checked. Checks for auxiliary
coordinates used to derive the coordinates can be ignored with
`check_aux_coords`. Defaults to False.

Returns:
Boolean.

@@ -905,6 +1018,21 @@ def register(
if not coord_a == coord_b:
match = False

# Check for compatible derived coordinates.
if match:
if check_derived_coords:
for coord_a, coord_b in zip(
self._cube_signature.derived_coords_and_dims,
cube_signature.derived_coords_and_dims,
):
# Derived coords that span the candidate axis can differ
if (
candidate_axis not in coord_a.dims
or candidate_axis not in coord_b.dims
):
if not coord_a == coord_b:
match = False

if match:
# Register the cube as a source-cube for this proto-cube.
self._add_skeleton(coord_signature, cube.lazy_data())
@@ -1088,6 +1216,64 @@ def _build_ancillary_variables(self):

return ancillary_variables_and_dims

def _build_aux_factories(self, dim_coords_and_dims, aux_coords_and_dims):
"""
Generate the aux factories for the new concatenated cube.

Args:

* dim_coords_and_dims:
A list of dimension coordinate and dimension tuple pairs from the
concatenated cube.

* aux_coords_and_dims:
A list of auxiliary coordinates and dimension(s) tuple pairs from
the concatenated cube.

Returns:
A list of :class:`iris.aux_factory.AuxCoordFactory`.

"""
# Setup convenience hooks.
cube_signature = self._cube_signature
old_dim_coords = cube_signature.dim_coords
old_aux_coords = [a[0] for a in cube_signature.aux_coords_and_dims]
new_dim_coords = [d[0] for d in dim_coords_and_dims]
new_aux_coords = [a[0] for a in aux_coords_and_dims]
scalar_coords = cube_signature.scalar_coords

aux_factories = []

# Generate all the factories for the new concatenated cube.
for i, (coord, dims, factory) in enumerate(
cube_signature.derived_coords_and_dims
):
# Check whether the derived coordinate of the factory spans the
# nominated dimension of concatenation.
if self.axis in dims:
# Update the dependencies of the factory with coordinates of
# the concatenated cube. We need to check all coordinate types
# here (dim coords, aux coords, and scalar coords).
new_dependencies = {}
for old_dependency in factory.dependencies.values():
if old_dependency in old_dim_coords:
dep_idx = old_dim_coords.index(old_dependency)
new_dependency = new_dim_coords[dep_idx]
elif old_dependency in old_aux_coords:
dep_idx = old_aux_coords.index(old_dependency)
new_dependency = new_aux_coords[dep_idx]
else:
dep_idx = scalar_coords.index(old_dependency)
new_dependency = scalar_coords[dep_idx]
new_dependencies[id(old_dependency)] = new_dependency

# Create new factory with the updated dependencies.
factory = factory.updated(new_dependencies)

aux_factories.append(factory)

return aux_factories

def _build_data(self):
"""
Generate the data payload for the new concatenated cube.
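
A closing note on the dependency-remapping pattern used by _build_aux_factories above: AuxCoordFactory.updated takes a mapping keyed by the id() of each existing dependency coordinate and returns a new factory built on the replacements, which is why the loop above maps every old dependency, not just those that changed. The following standalone sketch (illustrative values only, not part of the PR) shows that pattern in isolation.

import numpy as np
from iris.aux_factory import HybridPressureFactory
from iris.coords import AuxCoord

# A tiny hybrid-pressure factory built from three dependency coordinates.
delta = AuxCoord(np.array([1000.0, 500.0]), long_name="level_pressure", units="Pa")
sigma = AuxCoord(np.array([0.9, 0.5]), long_name="sigma", units="1")
old_ps = AuxCoord(np.array([100000.0]), standard_name="surface_air_pressure", units="Pa")
factory = HybridPressureFactory(delta=delta, sigma=sigma, surface_air_pressure=old_ps)

# Remap every dependency, just as _build_aux_factories does: unchanged
# coordinates map to themselves, the replaced one maps to its new version.
new_ps = old_ps.copy(points=np.array([101325.0]))
mapping = {id(delta): delta, id(sigma): sigma, id(old_ps): new_ps}
new_factory = factory.updated(mapping)
print(new_factory.surface_air_pressure is new_ps)  # True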