Lazy netcdf saves #5191

Merged: 92 commits, Apr 21, 2023.
Changes from 1 commit (of the 92 listed below):
cd7fa42
Basic functional lazy saving.
pp-mo Oct 21, 2022
1f32800
Simplify function signature which upsets Sphinx.
pp-mo Oct 21, 2022
e0f980f
Non-lazy saves return nothing.
pp-mo Oct 21, 2022
67b96cf
Now fixed to enable use with process/distributed scheduling.
pp-mo Oct 23, 2022
8cdbc9b
Remove dask.utils.SerializableLock, which I think was a mistake.
pp-mo Mar 3, 2023
8723f24
Make DefferedSaveWrapper use _thread_safe_nc.
pp-mo Mar 8, 2023
d19a87f
Fixes for non-lazy save.
pp-mo Mar 9, 2023
45e0e60
Avoid saver error when no deferred writes.
pp-mo Mar 10, 2023
6a83200
Reorganise locking code, ready for shareable locks.
pp-mo Mar 10, 2023
78a9346
Remove optional usage of 'filelock' for lazy saves.
pp-mo Mar 10, 2023
31b12f7
Document dask-specific locking; implement differently for threads or …
pp-mo Mar 10, 2023
99b4f41
Minor fix for unit-tests.
pp-mo Mar 10, 2023
5d0a707
Pin libnetcdf to avoid problems -- see #5187.
pp-mo Mar 10, 2023
431036f
Minor test fix.
pp-mo Mar 10, 2023
dc368d9
Move DeferredSaveWrapper into _thread_safe_nc; replicate the NetCDFDa…
pp-mo Mar 13, 2023
6756a46
Update lib/iris/fileformats/netcdf/saver.py
pp-mo Mar 16, 2023
80b4b6c
Update lib/iris/fileformats/netcdf/_dask_locks.py
pp-mo Mar 16, 2023
78a8716
Update lib/iris/fileformats/netcdf/saver.py
pp-mo Mar 16, 2023
47bb08b
Small rename + reformat.
pp-mo Mar 17, 2023
0ece09a
Remove Saver lazy option; all lazy saves are delayed; factor out fill…
pp-mo Mar 18, 2023
940f544
Merge branch 'main' into lazy_save_2
pp-mo Mar 18, 2023
eb97130
Merge remote-tracking branch 'upstream/main' into lazy_save_2
pp-mo Mar 20, 2023
4596081
Repurposed 'test__FillValueMaskCheckAndStoreTarget' to 'test__data_fi…
pp-mo Mar 20, 2023
ad49fbe
Disable (temporary) saver debug printouts.
pp-mo Mar 20, 2023
b29c927
Fix test problems; Saver automatically completes to preserve existing…
pp-mo Mar 20, 2023
8f10281
Fix docstring error.
pp-mo Mar 20, 2023
6a564d9
Fix spurious error in old saver test.
pp-mo Mar 20, 2023
2fb4d6c
Fix Saver docstring.
pp-mo Mar 20, 2023
c84bfdc
More robust exit for NetCDFWriteProxy operation.
pp-mo Mar 20, 2023
5b78085
Fix doctests by making the Saver example functional.
pp-mo Mar 21, 2023
478332e
Improve docstrings; unify terminology; simplify non-lazy save call.
pp-mo Mar 23, 2023
34f154c
Moved netcdf cell-method handling into nc_load_rules.helpers, and var…
pp-mo Mar 27, 2023
d3744ba
Merge branch 'latest' into lazy_save_2
pp-mo Mar 27, 2023
9673ea0
Fix lockfiles and Makefile process.
pp-mo Mar 27, 2023
bcbcbc8
Add unit tests for routine _fillvalue_report().
pp-mo Mar 27, 2023
05c04a1
Remove debug-only code.
pp-mo Mar 27, 2023
679ea47
Added tests for what the save function does with the 'compute' keyword.
pp-mo Mar 28, 2023
70ec9dd
Fix mock-specific problems, small tidy.
pp-mo Mar 28, 2023
28a4674
Restructure hierarchy of tests.unit.fileformats.netcdf
pp-mo Mar 29, 2023
67f4b2b
Tidy test docstrings.
pp-mo Mar 29, 2023
ebec72f
Correct test import.
pp-mo Mar 29, 2023
1f5b904
Avoid incorrect checking of byte data, and a numpy deprecation warning.
pp-mo Mar 29, 2023
5045c9f
Alter parameter names to make test reports clearer.
pp-mo Mar 29, 2023
393407a
Test basic behaviour of _lazy_stream_data; make 'Saver._delayed_write…
pp-mo Mar 29, 2023
518360b
Add integration tests, and distributed dependency.
pp-mo Mar 30, 2023
5c9931f
Docstring fixes.
pp-mo Mar 31, 2023
7daee68
Documentation section and whatsnew entry.
pp-mo Apr 4, 2023
97474f9
Merge branch 'main' into lazy_save_2
pp-mo Apr 4, 2023
64c7251
Various fixes to whatsnew, docstrings and docs.
pp-mo Apr 4, 2023
75043f9
Minor review changes, fix doctest.
pp-mo Apr 11, 2023
445fbe2
Arrange tests + results to organise by package-name alone.
pp-mo Apr 11, 2023
09cb22e
Review changes.
pp-mo Apr 11, 2023
3445f58
Review changes.
pp-mo Apr 12, 2023
cb1e1f7
Enhance tests + debug.
pp-mo Apr 12, 2023
1c81cee
Support scheduler type 'single-threaded'; allow retries on delayed-sa…
pp-mo Apr 13, 2023
370837b
Improve test.
pp-mo Apr 13, 2023
2f5f3c2
Adding a whatsnew entry for 5224 (#5234)
HGWright Apr 4, 2023
a55c6f2
Replacing numpy legacy printing with array2string and remaking result…
HGWright Apr 4, 2023
4914e99
adding a whatsnew entry
HGWright Apr 4, 2023
bd642cd
configure codecov
HGWright Apr 4, 2023
bc5bdd1
remove results creation commit from blame
HGWright Apr 4, 2023
301e59e
fixing whatsnew entry
HGWright Apr 4, 2023
7b3044d
Bump scitools/workflows from 2023.04.1 to 2023.04.2 (#5236)
dependabot[bot] Apr 5, 2023
02f2b66
Use real array for data of small netCDF variables. (#5229)
pp-mo Apr 6, 2023
a7e0689
Handle derived coordinates correctly in `concatenate` (#5096)
schlunma Apr 12, 2023
c4e8bbb
clarity on whatsnew entry contributors (#5240)
bjlittle Apr 12, 2023
e6661b8
Modernize and simplify iris.analysis._Groupby (#5015)
bouweandela Apr 12, 2023
afbdbbd
Finalises Lazy Data documentation (#5137)
ESadek-MO Apr 12, 2023
b8bb753
Fixes to _discontiguity_in_bounds (attempt 2) (#4975)
stephenworsley Apr 12, 2023
97cc149
update ci locks location (#5228)
bjlittle Apr 13, 2023
f14a321
Updated environment lockfiles (#5211)
scitools-ci[bot] Apr 13, 2023
f7a0b87
Increase retries.
pp-mo Apr 13, 2023
69ddd9d
Change debug to show which elements failed.
pp-mo Apr 13, 2023
8235d60
update cf standard units (#5244)
ESadek-MO Apr 13, 2023
724c6d2
libnetcdf <4.9 pin (#5242)
trexfeathers Apr 13, 2023
4f50dc7
Avoid possible same-file crossover between tests.
pp-mo Apr 13, 2023
0da68cf
Ensure all-different testfiles; load all vars lazy.
pp-mo Apr 13, 2023
e8b7bfd
Revert changes to testing framework.
pp-mo Apr 13, 2023
ad48caf
Remove repeated line from requirements/py*.yml (?merge error), and re…
pp-mo Apr 13, 2023
291b587
Revert some more debug changes.
pp-mo Apr 13, 2023
b2260ef
Merge branch 'latest' into lazy_save_2
pp-mo Apr 13, 2023
33a7d86
Reorganise test for better code clarity.
pp-mo Apr 14, 2023
db6932d
Use public 'Dataset.isopen()' instead of '._isopen'.
pp-mo Apr 14, 2023
631e001
Create output files in unique temporary directories.
pp-mo Apr 14, 2023
2869f97
Tests for fileformats.netcdf._dask_locks.
pp-mo Apr 14, 2023
419727b
Merge branch 'latest' into lazy_save_2
pp-mo Apr 21, 2023
2f4458b
Fix attribution names.
pp-mo Apr 21, 2023
88b7a2a
Merge branch 'latest' into lazy_save_2
pp-mo Apr 21, 2023
98a20e7
Fixed new py311 lockfile.
pp-mo Apr 21, 2023
bbc1167
Fix typos spotted by codespell.
pp-mo Apr 21, 2023
ed38e43
Add distributed test dep for python 3.11
pp-mo Apr 21, 2023
54ec0f8
Fix lockfile for python 3.11
pp-mo Apr 21, 2023
Remove Saver lazy option; all lazy saves are delayed; factor out fill-value checks and make them delayable.
pp-mo committed Mar 18, 2023
commit 0ece09a2a54899ba97154520bebc49e07db9d31b
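For orientation before the diff: after this commit, every lazy save is deferred, and a save made with `compute=False` returns a `dask.delayed.Delayed` which, when computed, streams the lazy data into the file and returns a list of fill-value warnings. A minimal usage sketch, assuming the public `iris.save` entry point forwards the `compute` keyword to this saver as the docstrings below describe (file names and input data are illustrative):

```python
import warnings

import iris

cubes = iris.load("input.nc")  # illustrative input file

# Deferred save: the file and its variables are created immediately,
# but lazy data is not streamed and no fill-value checks run yet.
delayed_save = iris.save(cubes, "output.nc", compute=False)

# ... later, complete the write (possibly alongside other delayed saves):
fill_warnings = delayed_save.compute()

# The compute returns the warnings that a compute=True save would have
# issued directly; re-issue them here.
for w in fill_warnings:
    warnings.warn(w)
```

Several such delayed saves can be passed together into one `dask.compute` call, which is the point of deferring: the writes can then run in parallel, including under a distributed scheduler.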
298 changes: 195 additions & 103 deletions lib/iris/fileformats/netcdf/saver.py
@@ -26,7 +26,6 @@
import dask
import dask.array as da
import numpy as np
import numpy.ma as ma

from iris._lazy_data import _co_realise_lazy_arrays, is_lazy_data
from iris.aux_factory import (
@@ -468,39 +467,113 @@ def _setncattr(variable, name, attribute):
return variable.setncattr(name, attribute)


class _FillValueMaskCheckAndStoreTarget:
"""
To be used with da.store. Remembers whether any element was equal to a
given value and whether it was masked, before passing the chunk to the
given target.
# NOTE : this matches :class:`iris.experimental.ugrid.mesh.Mesh.ELEMENTS`,
# but in the preferred order for coord/connectivity variables in the file.
MESH_ELEMENTS = ("node", "edge", "face")


_FillvalueCheckInfo = collections.namedtuple(
"_FillvalueCheckInfo", ["user_value", "check_value", "dtype", "varname"]
)


NOTE: target needs to be a _thread_safe_nc._ThreadSafeWrapper subclass.
def _PRINT_DEBUG(*args):
_DO_DEBUG = True
# _DO_DEBUG = False
if _DO_DEBUG:
print(*args)


def _data_fillvalue_check(arraylib, data, check_value):
"""
Check whether an array is masked, and whether it contains a fill-value.

Parameters
----------
arraylib : module
Either numpy or dask.array : When dask, results are lazy computations.
data : array-like
Array to check (numpy or dask)
check_value : number or None
If not None, fill-value to check for existence in the array.
If None, do not do value-in-array check

Returns
-------
is_masked : bool
True if array has any masked points.
contains_value : bool
True if array contains check_value.
Always False if check_value is None.

def __init__(self, target, fill_value=None):
assert hasattr(target, "THREAD_SAFE_FLAG")
self.target = target
self.fill_value = fill_value
self.contains_value = False
self.is_masked = False
"""
is_masked = arraylib.any(arraylib.ma.getmaskarray(data))
if check_value is None:
contains_value = False
else:
contains_value = arraylib.any(data == check_value)
return is_masked, contains_value

def __setitem__(self, keys, arr):
if self.fill_value is not None:
self.contains_value = self.contains_value or self.fill_value in arr
self.is_masked = self.is_masked or ma.is_masked(arr)
self.target[keys] = arr

def _fillvalue_report(fill_info, is_masked, contains_fill_value, warn=False):
"""
From the given information, work out whether there was a possible or actual
fill-value collision, and if so construct a warning.

Parameters
----------
fill_info : _FillvalueCheckInfo
A namedtuple containing the context of the fill-value check
is_masked : bool
whether the data array was masked
contains_fill_value : bool
whether the data array contained the fill-value
warn : bool
if True, also issue any resulting warning immediately.

Returns
-------
None or :class:`Warning`
If not None, indicates a known or possible problem with filling

# NOTE : this matches :class:`iris.experimental.ugrid.mesh.Mesh.ELEMENTS`,
# but in the preferred order for coord/connectivity variables in the file.
MESH_ELEMENTS = ("node", "edge", "face")
"""
varname = fill_info.varname
user_value = fill_info.user_value
check_value = fill_info.check_value
is_byte_data = fill_info.dtype.itemsize == 1
result = None
if is_byte_data and is_masked and user_value is None:
_PRINT_DEBUG(f'Data check "{varname}" : masked byte warning')
result = UserWarning(
f"CF var '{varname}' contains byte data with masked points, but "
"no fill_value keyword was given. As saved, these "
"points will read back as valid values. To save as "
"masked byte data, `_FillValue` needs to be explicitly "
"set. For Cube data this can be done via the 'fill_value' "
"keyword during saving, otherwise use ncedit/equivalent."
)
elif contains_fill_value:
_PRINT_DEBUG(f'Data check "{varname}" : contains-fill warning')
result = UserWarning(
f"CF var '{varname}' contains unmasked data points equal to the "
f"fill-value, {check_value}. As saved, these points will read back "
"as missing data. To save these as normal values, "
"`_FillValue` needs to be set to not equal any valid data "
"points. For Cube data this can be done via the 'fill_value' "
"keyword during saving, otherwise use ncedit/equivalent."
)
else:
_PRINT_DEBUG(f'Data check "{varname}" : all-values-ok')

if warn and result is not None:
warnings.warn(result)
return result


class Saver:
"""A manager for saving netcdf files."""

def __init__(self, filename, netcdf_format, compute=True):
def __init__(self, filename, netcdf_format):
"""
A manager for saving netcdf files.

@@ -513,15 +586,6 @@ def __init__(self, filename, netcdf_format, compute=True):
Underlying netCDF file format, one of 'NETCDF4', 'NETCDF4_CLASSIC',
'NETCDF3_CLASSIC' or 'NETCDF3_64BIT'. Default is 'NETCDF4' format.

* compute (bool):
If True, the Saver performs normal 'synchronous' data writes, where data
is streamed directly into file variables during the save operation.
If False, the file is created as normal, but computation and streaming of
any lazy array content is instead deferred to :class:`dask.delayed.Delayed`
objects, which are held in a list in the saver 'delayed_writes' property.
The relevant file variables are created empty, and the write can
subsequently be completed by computing the 'save.deferred_writes'.

Returns:
None.

@@ -560,8 +624,6 @@ def __init__(self, filename, netcdf_format, compute=True):
self._formula_terms_cache = {}
#: Target filepath
self.filepath = os.path.abspath(filename)
#: Whether lazy saving.
self.lazy_saves = not compute
#: A list of deferred writes for lazy saving : each is a (source, target) pair
self.deferred_writes = []
# N.B. the file-write-lock *type* actually depends on the dask scheduler type.
@@ -2473,101 +2535,125 @@ def _lazy_stream_data(self, data, fill_value, fill_warn, cf_var):
# contains just 1 row, so the cf_var is 1D.
data = data.squeeze(axis=0)

if is_lazy_data(data):
if self.lazy_saves:
# deferred lazy streaming
def store(data, cf_var, fill_value):
# Create a data-writeable object that we can stream into, which
# encapsulates the file to be opened + variable to be written.
write_wrapper = _thread_safe_nc.NetCDFWriteProxy(
self.filepath, cf_var, self.file_write_lock
)
# Add to the list of deferred writes, used in _deferred_save().
self.deferred_writes.append((data, write_wrapper))
# NOTE: in this case, no checking of fill-value violations so just
# return dummy values for this.
# TODO: just for now -- can probably make this work later
is_masked, contains_value = False, False
return is_masked, contains_value

else:
# Immediate streaming store : check mask+fill as we go.
def store(data, cf_var, fill_value):
# Store lazy data and check whether it is masked and contains
# the fill value
target = _FillValueMaskCheckAndStoreTarget(
cf_var, fill_value
)
da.store([data], [target], lock=False)
return target.is_masked, target.contains_value

else:
# Real data is always written directly, i.e. not via lazy save.
def store(data, cf_var, fill_value):
cf_var[:] = data
is_masked = np.ma.is_masked(data)
contains_value = fill_value is not None and fill_value in data
return is_masked, contains_value

# Decide whether we are checking for fill-value collisions.
dtype = cf_var.dtype

# fill_warn allows us to skip warning if packing attributes have been
# specified. It would require much more complex operations to work out
# what the values and fill_value _would_ be in such a case.
if fill_warn:
if fill_value is not None:
fill_value_to_check = fill_value
else:
# Retain 'fill_value == None', to show that no specific value was given.
# But set 'fill_value_to_check' to a calculated value
fill_value_to_check = _thread_safe_nc.default_fillvals[
dtype.str[1:]
]
else:
# A None means we will NOT check for collisions.
fill_value_to_check = None

fill_info = _FillvalueCheckInfo(
user_value=fill_value,
check_value=fill_value_to_check,
dtype=dtype,
varname=cf_var.name,
)

doing_delayed_save = is_lazy_data(data)
if doing_delayed_save:
# save lazy data with a delayed operation. For now, we just record the
# necessary information -- a single, complete delayed action is constructed
# later by a call to _delayed_save().
def store(data, cf_var, fill_value):
# Create a data-writeable object that we can stream into, which
# encapsulates the file to be opened + variable to be written.
write_wrapper = _thread_safe_nc.NetCDFWriteProxy(
self.filepath, cf_var, self.file_write_lock
)
# Add to the list of deferred writes, used in _delayed_save().
self.deferred_writes.append((data, write_wrapper, fill_info))
# In this case, fill-value checking is done later. But return 2 dummy
# values, to be consistent with the non-streamed "store" signature.
is_masked, contains_value = False, False
return is_masked, contains_value

else:
# Real data is always written directly, i.e. not via lazy save.
# We also check it immediately for any fill-value problems.
def store(data, cf_var, fill_value):
cf_var[:] = data
return _data_fillvalue_check(np, data, fill_value)

# Store the data and check if it is masked and contains the fill value.
is_masked, contains_fill_value = store(
data, cf_var, fill_value_to_check
)

if dtype.itemsize == 1 and fill_value is None:
if is_masked:
msg = (
"CF var '{}' contains byte data with masked points, but "
"no fill_value keyword was given. As saved, these "
"points will read back as valid values. To save as "
"masked byte data, `_FillValue` needs to be explicitly "
"set. For Cube data this can be done via the 'fill_value' "
"keyword during saving, otherwise use ncedit/equivalent."
)
warnings.warn(msg.format(cf_var.name))
elif contains_fill_value:
msg = (
"CF var '{}' contains unmasked data points equal to the "
"fill-value, {}. As saved, these points will read back "
"as missing data. To save these as normal values, "
"`_FillValue` needs to be set to not equal any valid data "
"points. For Cube data this can be done via the 'fill_value' "
"keyword during saving, otherwise use ncedit/equivalent."
if doing_delayed_save:
_PRINT_DEBUG(
f'Data check "{fill_info.varname}" : NO CHECK YET (delayed)'
)
else:
# Issue a fill-value warning immediately, if appropriate.
_fillvalue_report(
fill_info, is_masked, contains_fill_value, warn=True
)
warnings.warn(msg.format(cf_var.name, fill_value))

def _deferred_save(self):
def _delayed_save(self):
"""
Create a 'delayed' to trigger file completion for lazy saves.

This contains all the deferred writes, which complete the file by filling out
the data of variables initially created empty.
the data of variables initially created empty, and also the checks for
potential fill-value collisions.

"""
if self.deferred_writes:
# Create a single delayed da.store operation to complete the file.
sources, targets = zip(*self.deferred_writes)
result = da.store(sources, targets, compute=False, lock=False)
sources, targets, fill_infos = zip(*self.deferred_writes)
store_op = da.store(sources, targets, compute=False, lock=False)

# Construct a delayed fill-check operation for each (lazy) source array.
delayed_fillvalue_checks = [
# NB with arraylib=dask.array, this routine does lazy array computation
_data_fillvalue_check(da, source, fillinfo.check_value)
for source, fillinfo in zip(sources, fill_infos)
]

# Return a single delayed object which completes the delayed saves and
# returns a list of any fill-value warnings.
@dask.delayed
def compute_and_return_warnings(store_op, fv_infos, fv_checks):
# Note: we don't actually *do* anything with the store_op, but
# including it here ensures that dask will compute it (thus performing
# all the delayed saves), before calling this function.
results = []
# Pair each fill_check result (is_masked, contains_value) with its
# fillinfo and construct a suitable Warning if needed.
for fillinfo, (is_masked, contains_value) in zip(
fv_infos, fv_checks
):
fv_warning = _fillvalue_report(
fill_info=fillinfo,
is_masked=is_masked,
contains_fill_value=contains_value,
)
if fv_warning is not None:
# Collect the warnings and return them.
results.append(fv_warning)
return results

result = compute_and_return_warnings(
store_op,
fv_infos=fill_infos,
fv_checks=delayed_fillvalue_checks,
)

else:
# Return a delayed anyway, just for usage consistency.
# Return a delayed, which returns an empty list, for usage consistency.
@dask.delayed
def no_op():
return None
return []

result = no_op()

@@ -2720,9 +2806,12 @@ def save(
compute all the lazy content and stream it to complete the file.
Several such data saves can be performed in parallel, by passing a list of them
into a :func:`dask.compute` call.
Note: when computed, the returned :class:`dask.delayed.Delayed` object returns
a list of :class:`Warning` : These are any warnings that _would_ have been
issued in the save call, if compute had been True.

Returns:
None.
A list of :class:`Warning`.

.. note::

@@ -2823,7 +2912,7 @@ def is_valid_packspec(p):
# Initialise Manager for saving
# N.B. FOR NOW -- we are cheating and making all saves compute=False, as otherwise
# non-lazy saves do *not* work with the distributed scheduler.
with Saver(filename, netcdf_format, compute=False) as sman:
with Saver(filename, netcdf_format) as sman:
# Iterate through the cubelist.
for cube, packspec, fill_value in zip(cubes, packspecs, fill_values):
sman.write(
@@ -2869,11 +2958,14 @@ def is_valid_packspec(p):
# Add conventions attribute.
sman.update_global_attributes(Conventions=conventions)

# For now, not using Saver(compute=True) as it doesn't work with distributed or
# process workers (only threaded).
result = sman._deferred_save()
result = sman._delayed_save()
if compute:
result = result.compute()
# Complete the saves now, and handle any delayed warnings that occurred
result_warnings = result.compute()
# Issue any delayed warnings from the compute.
for delayed_warning in result_warnings:
warnings.warn(delayed_warning)

result = None

return result
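A closing note on the `arraylib` parameter of `_data_fillvalue_check` above: passing the array module itself (numpy or dask.array) lets one routine serve both the immediate and the delayed code paths, since both modules expose `any` and `ma.getmaskarray`. A standalone sketch of the pattern, with the check logic copied from the diff and the driver data invented for illustration:

```python
import dask
import dask.array as da
import numpy as np


def fillvalue_check(arraylib, data, check_value):
    # Mirrors _data_fillvalue_check: returns plain booleans when arraylib
    # is numpy, or lazy 0-d dask arrays when arraylib is dask.array.
    is_masked = arraylib.any(arraylib.ma.getmaskarray(data))
    if check_value is None:
        contains_value = False
    else:
        contains_value = arraylib.any(data == check_value)
    return is_masked, contains_value


# Immediate (numpy) path, as used for real data in _lazy_stream_data:
real = np.ma.masked_array([1.0, 2.0, 9.0], mask=[False, True, False])
print(fillvalue_check(np, real, check_value=9.0))  # (True, True)

# Deferred (dask) path, as used in _delayed_save: nothing is computed
# until dask.compute(), so the checks ride along with the delayed writes.
lazy = da.ma.masked_array(
    da.from_array(np.array([1.0, 2.0, 9.0]), chunks=2),
    mask=[False, True, False],
)
flags = fillvalue_check(da, lazy, check_value=9.0)
print(dask.compute(*flags))  # (True, True)
```

Parameterising on the module is what lets `_delayed_save` fold the fill-value checks into the same dask graph as the `da.store` writes, so a single compute performs both.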