More correct deprecation warning for lock argument #5256

Merged: 6 commits, May 4, 2021
5 changes: 5 additions & 0 deletions doc/whats-new.rst
@@ -131,6 +131,11 @@ Deprecations
:py:func:`xarray.open_mfdataset` when `combine='by_coords'` is specified.
Fixes (:issue:`5230`), via (:pull:`5231`, :pull:`5255`).
By `Tom Nicholas <https://github.com/TomNicholas>`_.
- The `lock` keyword argument to :py:func:`open_dataset` and :py:func:`open_dataarray` is now
  a backend-specific option. Passing it to a backend that does not support it now emits a warning
  instead of being silently ignored; from the next version it will raise an error.
  This is part of the refactor to support external backends (:issue:`5073`).
  By `Tom Nicholas <https://github.com/TomNicholas>`_ and `Alessandro Amici <https://github.com/alexamici>`_.


Bug fixes
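To illustrate the behaviour described in the changelog entry above, a minimal sketch follows. It assumes a hypothetical local Zarr store at "store.zarr" and the zarr backend installed; it only demonstrates that a DeprecationWarning is now emitted instead of the argument being dropped silently.

import warnings

import xarray as xr

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # "store.zarr" is a placeholder path; the zarr backend does not use `lock`,
    # so passing any non-None value now triggers the deprecation warning.
    ds = xr.open_dataset("store.zarr", engine="zarr", lock=False)

assert any(issubclass(w.category, DeprecationWarning) for w in caught)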
4 changes: 2 additions & 2 deletions xarray/backends/api.py
@@ -449,7 +449,7 @@ def open_dataset(
relevant when using dask or another form of parallelism. By default,
appropriate locks are chosen to safely read and write files with the
currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
"pynio", "pseudonetcdf", "cfgrib".
"scipy", "pynio", "pseudonetcdf", "cfgrib".

See engine open function for kwargs accepted by each specific engine.

@@ -633,7 +633,7 @@ def open_dataarray(
relevant when using dask or another form of parallelism. By default,
appropriate locks are chosen to safely read and write files with the
currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
"pynio", "pseudonetcdf", "cfgrib".
"scipy", "pynio", "pseudonetcdf", "cfgrib".

See engine open function for kwargs accepted by each specific engine.

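For the engines listed in the docstring above, `lock` remains a supported option. A minimal sketch of passing an explicit lock, assuming a hypothetical NetCDF file "data.nc" and the netcdf4 engine installed:

import threading

import xarray as xr

# "data.nc" is a placeholder file name. An explicit lock overrides the
# default lock that xarray would otherwise choose based on the active
# dask scheduler.
ds = xr.open_dataset("data.nc", engine="netcdf4", lock=threading.Lock())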
11 changes: 11 additions & 0 deletions xarray/backends/pydap_.py
@@ -1,3 +1,5 @@
import warnings

import numpy as np

from ..core import indexing
@@ -122,7 +124,16 @@ def open_dataset(
        use_cftime=None,
        decode_timedelta=None,
        session=None,
        lock=None,
    ):
        # TODO remove after v0.19
        if lock is not None:
            warnings.warn(
                "The kwarg 'lock' has been deprecated for this backend, and is now "
                "ignored. In the future passing lock will raise an error.",
                DeprecationWarning,
            )

        store = PydapDataStore.open(
            filename_or_obj,
            session=session,
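A sketch of how the warning added above could be exercised in a test, assuming pytest is available and a hypothetical OPeNDAP URL:

import pytest

import xarray as xr

# The URL is a placeholder; the pydap backend ignores `lock`, so any
# non-None value should raise the DeprecationWarning added above.
with pytest.warns(DeprecationWarning, match="lock"):
    ds = xr.open_dataset(
        "http://example.com/opendap/dataset",
        engine="pydap",
        lock=False,
    )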
9 changes: 9 additions & 0 deletions xarray/backends/zarr.py
@@ -1,5 +1,6 @@
import os
import pathlib
import warnings
from distutils.version import LooseVersion

import numpy as np
@@ -721,7 +722,15 @@ def open_dataset(
        consolidate_on_close=False,
        chunk_store=None,
        storage_options=None,
        lock=None,
    ):
        # TODO remove after v0.19
        if lock is not None:
            warnings.warn(
                "The kwarg 'lock' has been deprecated for this backend, and is now "
                "ignored. In the future passing lock will raise an error.",
                DeprecationWarning,
            )

        filename_or_obj = _normalize_path(filename_or_obj)
        store = ZarrStore.open_group(
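During the deprecation period, code that still passes `lock` to the zarr backend keeps working; the warning can be silenced explicitly while callers are updated. A sketch, again assuming a hypothetical local store "store.zarr":

import warnings

import xarray as xr

with warnings.catch_warnings():
    # Silence only the deprecation warning for the ignored `lock` argument
    # while downstream code is migrated.
    warnings.simplefilter("ignore", DeprecationWarning)
    ds = xr.open_dataset("store.zarr", engine="zarr", lock=False)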