
Commit

Merge branch 'main' of https://github.com/PCMDI/pcmdi_metrics into ao_sperber_patch

Ana Ordonez committed May 2, 2024
2 parents 2dc5501 + edb94c9 commit a0d50e4
Showing 43 changed files with 2,656 additions and 1,675 deletions.
4 changes: 2 additions & 2 deletions .pre-commit-config.yaml
@@ -25,7 +25,7 @@ repos:
- id: black

- repo: https://github.com/timothycrosley/isort
rev: 5.12.0
rev: 5.13.2
hooks:
- id: isort
args: ["--honor-noqa"]
@@ -34,7 +34,7 @@ repos:
# Python linting
# =======================
- repo: https://github.com/pycqa/flake8
rev: 6.0.0
rev: 7.0.0
hooks:
- id: flake8
args: ["--config=setup.cfg"]
7 changes: 7 additions & 0 deletions README.md
@@ -48,6 +48,11 @@ Documentation
* [View Demo](https://github.com/PCMDI/pcmdi_metrics/blob/main/doc/jupyter/Demo/README.md)


**Reference**

Lee, J., P. J. Gleckler, M.-S. Ahn, A. Ordonez, P. Ullrich, K. R. Sperber, K. E. Taylor, Y. Y. Planton, E. Guilyardi, P. Durack, C. Bonfils, M. D. Zelinka, L.-W. Chao, B. Dong, C. Doutriaux, C. Zhang, T. Vo, J. Boutte, M. F. Wehner, A. G. Pendergrass, D. Kim, Z. Xue, A. T. Wittenberg, and J. Krasting, 2024: Systematic and Objective Evaluation of Earth System Models: PCMDI Metrics Package (PMP) version 3. Geoscientific Model Development (_accepted, publication in progress_) [[preprint](https://egusphere.copernicus.org/preprints/2023/egusphere-2023-2720/)]


Contact
-------

@@ -104,6 +109,7 @@ Release Notes and History

| <div style="width:300%">[Versions]</div> | Update summary |
| ------------- | ------------------------------------- |
| [v3.4] | Technical update: Modes of variability [xCDAT](https://xcdat.readthedocs.io/en/latest/) conversion
| [v3.3.4] | Technical update
| [v3.3.3] | Technical update
| [v3.3.2] | Technical update
@@ -136,6 +142,7 @@ Release Notes and History


[Versions]: https://github.com/PCMDI/pcmdi_metrics/releases
[v3.4]: https://github.com/PCMDI/pcmdi_metrics/releases/tag/v3.4
[v3.3.4]: https://github.com/PCMDI/pcmdi_metrics/releases/tag/v3.3.4
[v3.3.3]: https://github.com/PCMDI/pcmdi_metrics/releases/tag/v3.3.3
[v3.3.2]: https://github.com/PCMDI/pcmdi_metrics/releases/tag/v3.3.2
4 changes: 2 additions & 2 deletions conda-env/dev.yml
@@ -18,10 +18,10 @@ dependencies:
- genutil=8.2.1
- cdutil=8.2.1
- cdp=1.7.0
- eofs=1.4.0
- eofs=1.4.1
- seaborn=0.12.2
- enso_metrics=1.1.1
- xcdat>=0.6.1
- xcdat>=0.7.0
- xmltodict=0.13.0
- setuptools=67.7.2
- netcdf4=1.6.3
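A quick sanity check that the bumped pins are what the dev environment actually resolves to (a sketch; run inside the conda environment built from this file):

```python
# Minimal sanity check of the updated pins (expected versions per conda-env/dev.yml).
import eofs
import xcdat

print(eofs.__version__)   # expected: 1.4.1
print(xcdat.__version__)  # expected: >= 0.7.0
```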
1,102 changes: 452 additions & 650 deletions doc/jupyter/Demo/Demo_4_modes_of_variability.ipynb

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion docs/index.rst
@@ -39,7 +39,7 @@ such as datasets from the `obs4MIPs`_ project.

References
==========
Lee et al. (in prep, to be submitted soon), Objective Evaluation of Earth System Models: PCMDI Metrics Package (PMP) version 3, Geoscientific Model Development
Lee, J., P. J. Gleckler, M.-S. Ahn, A. Ordonez, P. Ullrich, K. R. Sperber, K. E. Taylor, Y. Y. Planton, E. Guilyardi, P. Durack, C. Bonfils, M. D. Zelinka, L.-W. Chao, B. Dong, C. Doutriaux, C. Zhang, T. Vo, J. Boutte, M. F. Wehner, A. G. Pendergrass, D. Kim, Z. Xue, A. T. Wittenberg, and J. Krasting, 2024: Systematic and Objective Evaluation of Earth System Models: PCMDI Metrics Package (PMP) version 3. Geoscientific Model Development (accepted, publication in progress) [`preprint<https://egusphere.copernicus.org/preprints/2023/egusphere-2023-2720/>`_].

Gleckler et al. (2016), A more powerful reality test for climate models, Eos, 97, `doi:10.1029/2016EO051663 <https://eos.org/science-updates/a-more-powerful-reality-test-for-climate-models>`_.

3 changes: 2 additions & 1 deletion docs/metrics_sea_ice.rst
@@ -28,7 +28,7 @@ or as a combination of an input parameter file and the command line, e.g.: ::

Outputs
=======
The driver produces a JSON file containing mean square error metrics for all input models and realizations relative to the reference data set. It also produces a bar chart displaying these metrics.
The driver produces two JSON files. The first contains mean square error metrics for all input models and realizations relative to the reference data set. The second contains sea ice climatology and area data. The driver also produces a bar chart displaying these metrics.

Sectors
########
@@ -66,6 +66,7 @@ A `demo parameter file`_ is provided in the sea ice code.
* **obs_area_template**: File path of grid area data. If unavailable, skip and use "obs_cell_area".
* **obs_area_var**: Name of reference area variable, if available. If unavailable, skip and use "obs_cell_area".
* **obs_cell_area**: For equal area grids, the area of a single grid cell in units of km :sup:`2` . Only required if obs area file is not available.
* **pole**: Set the maximum latitude for the Central Arctic and Arctic regions to exclude ice over the pole. Default is 90.1 to include all ice.
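For orientation, a minimal parameter-file sketch using only the options described above; every path and value here is a placeholder rather than the demo file's contents:

```python
# Hypothetical sea ice parameter snippet (placeholder paths and values).
obs_area_template = "/path/to/reference/areacello.nc"  # grid-cell area file for the reference data
obs_area_var = "areacello"                             # name of the area variable in that file
# obs_cell_area = 625.0                                # km^2 per cell; use this instead when no area file exists
pole = 90.1                                            # max latitude for the Central Arctic/Arctic regions
```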

Reference
=========
6 changes: 3 additions & 3 deletions pcmdi_metrics/graphics/portrait_plot/portrait_plot_lib.py
@@ -342,12 +342,12 @@ def portrait_plot(
# ----------------------------------------------------------------------
def prepare_data(data, xaxis_labels, yaxis_labels, debug=False):
# In case data was given as list of arrays, convert it to numpy (stacked) array
if type(data) == list:
if isinstance(data, list):
if debug:
print("data type is list")
print("len(data):", len(data))
if len(data) == 1: # list has only 1 array as element
if (type(data[0]) == np.ndarray) and (len(data[0].shape) == 2):
if isinstance(data[0], np.ndarray) and (len(data[0].shape) == 2):
data = data[0]
num_divide = 1
else:
@@ -366,7 +366,7 @@ def prepare_data(data, xaxis_labels, yaxis_labels, debug=False):
if data.shape[-2] != len(yaxis_labels) and len(yaxis_labels) > 0:
sys.exit("Error: Number of elements in yaxis_label mismatchs to the data")

if type(data) == np.ndarray:
if isinstance(data, np.ndarray):
# data = np.squeeze(data)
if len(data.shape) == 2:
num_divide = 1
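The change above from `type(x) == cls` to `isinstance(x, cls)` is the idiomatic Python check: it also accepts subclasses and avoids flake8's E721 warning about type comparisons. A small standalone sketch:

```python
import numpy as np

data = np.ma.masked_invalid([[1.0, 2.0], [3.0, np.nan]])  # a MaskedArray, i.e. an ndarray subclass

print(type(data) == np.ndarray)      # False: exact-type comparison rejects subclasses
print(isinstance(data, np.ndarray))  # True: masked arrays still count as ndarrays

# Mirrors the check in prepare_data(): a single 2-D array means one panel division.
if isinstance(data, np.ndarray) and len(data.shape) == 2:
    num_divide = 1
```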
8 changes: 5 additions & 3 deletions pcmdi_metrics/io/__init__.py
@@ -1,12 +1,13 @@
# init for pcmdi_metrics.io
from .xcdat_openxml import xcdat_open # noqa # isort:skip
from .string_constructor import StringConstructor, fill_template # noqa # isort:skip
from . import base # noqa
from .base import MV2Json # noqa
from .default_regions_define import load_regions_specs # noqa
from .default_regions_define import region_subset # noqa
from .xcdat_dataset_io import ( # noqa
from .xcdat_dataset_io import ( # noqa # isort:skip
da_to_ds,
get_axis_list,
get_data_list,
get_grid,
get_latitude_bounds_key,
get_latitude_key,
get_latitude,
@@ -21,3 +22,4 @@
get_time_key,
select_subset,
)
from .regions import load_regions_specs, region_subset # noqa
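A short usage sketch of the reorganized `pcmdi_metrics.io` namespace; the input file name and the "ts" variable are placeholders:

```python
from pcmdi_metrics.io import get_latitude, region_subset, select_subset, xcdat_open

ds = xcdat_open("ts_example.nc")        # hypothetical input file
lat = get_latitude(ds)                  # latitude coordinate, whatever it is named in the file
ds_band = select_subset(ds, lat=(-20, 20))               # simple latitude-band subset
ds_nao = region_subset(ds, region="NAO", data_var="ts")  # named region from load_regions_specs()
```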
2 changes: 1 addition & 1 deletion pcmdi_metrics/io/base.py
@@ -19,7 +19,7 @@

import pcmdi_metrics
from pcmdi_metrics import LOG_LEVEL
from pcmdi_metrics.utils import StringConstructor
from pcmdi_metrics.io import StringConstructor

value = 0
cdms2.setNetcdfShuffleFlag(value) # where value is either 0 or 1
@@ -1,7 +1,12 @@
from typing import Union

import xarray as xr
import xcdat as xc

from pcmdi_metrics.io import da_to_ds, get_longitude, select_subset


def load_regions_specs():
def load_regions_specs() -> dict:
regions_specs = {
# Mean Climate
"global": {},
@@ -35,7 +40,10 @@ def load_regions_specs():
"NAO": {"domain": {"latitude": (20.0, 80), "longitude": (-90, 40)}},
"SAM": {"domain": {"latitude": (-20.0, -90), "longitude": (0, 360)}},
"PNA": {"domain": {"latitude": (20.0, 85), "longitude": (120, 240)}},
"NPO": {"domain": {"latitude": (20.0, 85), "longitude": (120, 240)}},
"PDO": {"domain": {"latitude": (20.0, 70), "longitude": (110, 260)}},
"NPGO": {"domain": {"latitude": (20.0, 70), "longitude": (110, 260)}},
"AMO": {"domain": {"latitude": (0.0, 70), "longitude": (-80, 0)}},
# Monsoon domains for Wang metrics
# All monsoon domains
"AllMW": {"domain": {"latitude": (-40.0, 45.0), "longitude": (0.0, 360.0)}},
@@ -45,7 +53,8 @@
# South American Monsoon
"SAMM": {"domain": {"latitude": (-45.0, 0.0), "longitude": (240.0, 330.0)}},
# North African Monsoon
"NAFM": {"domain": {"latitude": (0.0, 45.0), "longitude": (310.0, 60.0)}},
# "NAFM": {"domain": {"latitude": (0.0, 45.0), "longitude": (310.0, 60.0)}},
"NAFM": {"domain": {"latitude": (0.0, 45.0), "longitude": (-50.0, 60.0)}},
# South African Monsoon
"SAFM": {"domain": {"latitude": (-45.0, 0.0), "longitude": (0.0, 90.0)}},
# Asian Summer Monsoon
@@ -70,55 +79,77 @@
return regions_specs


def region_subset(ds, regions_specs, region=None):
"""
d: xarray.Dataset
regions_specs: dict
region: string
def region_subset(
ds: Union[xr.Dataset, xr.DataArray],
region: str,
data_var: str = "variable",
regions_specs: dict = None,
debug: bool = False,
) -> Union[xr.Dataset, xr.DataArray]:
"""_summary_
Parameters
----------
ds : Union[xr.Dataset, xr.DataArray]
_description_
region : str
_description_
data_var : str, optional
_description_, by default None
regions_specs : dict, optional
_description_, by default None
debug: bool, optional
Turn on debug print, by default False
Returns
-------
Union[xr.Dataset, xr.DataArray]
_description_
"""
if isinstance(ds, xr.DataArray):
is_dataArray = True
ds = da_to_ds(ds, data_var)
else:
is_dataArray = False

if regions_specs is None:
regions_specs = load_regions_specs()

if "domain" in regions_specs[region]:
if "latitude" in regions_specs[region]["domain"]:
lat0 = regions_specs[region]["domain"]["latitude"][0]
lat1 = regions_specs[region]["domain"]["latitude"][1]
# proceed subset
ds = select_subset(ds, lat=(min(lat0, lat1), max(lat0, lat1)))
if debug:
print("region_subset, latitude subsetted, ds:", ds)

if "longitude" in regions_specs[region]["domain"]:
lon0 = regions_specs[region]["domain"]["longitude"][0]
lon1 = regions_specs[region]["domain"]["longitude"][1]

# check original dataset longitude range
lon_min = get_longitude(ds).min().values.item()
lon_max = get_longitude(ds).max().values.item()

# Check if longitude range swap is needed
if min(lon0, lon1) < 0:
# when subset region lon is defined in (-180, 180) range
if min(lon_min, lon_max) < 0:
# if original data lon range is (-180, 180), no treatment needed
pass
else:
# if original data lon range is (0, 360), convert and swap lon
ds = xc.swap_lon_axis(ds, to=(-180, 180))

# proceed subset
# ds = select_subset(ds, lon=(min(lon0, lon1), max(lon0, lon1)))
ds = select_subset(ds, lon=(lon0, lon1))
if debug:
print("region_subset, longitude subsetted, ds:", ds)

if (region is None) or (
(region is not None) and (region not in list(regions_specs.keys()))
):
print("Error: region not defined")
# return the same type
if is_dataArray:
return ds[data_var]
else:
if "domain" in list(regions_specs[region].keys()):
if "latitude" in list(regions_specs[region]["domain"].keys()):
lat0 = regions_specs[region]["domain"]["latitude"][0]
lat1 = regions_specs[region]["domain"]["latitude"][1]
# proceed subset
if "latitude" in (ds.coords.dims):
ds = ds.sel(latitude=slice(lat0, lat1))
elif "lat" in (ds.coords.dims):
ds = ds.sel(lat=slice(lat0, lat1))

if "longitude" in list(regions_specs[region]["domain"].keys()):
lon0 = regions_specs[region]["domain"]["longitude"][0]
lon1 = regions_specs[region]["domain"]["longitude"][1]

# check original dataset longitude range
if "longitude" in (ds.coords.dims):
lon_min = ds.longitude.min()
lon_max = ds.longitude.max()
elif "lon" in (ds.coords.dims):
lon_min = ds.lon.min()
lon_max = ds.lon.max()

# longitude range swap if needed
if (
min(lon0, lon1) < 0
): # when subset region lon is defined in (-180, 180) range
if (
min(lon_min, lon_max) < 0
): # if original data lon range is (-180, 180) no treatment needed
pass
else: # if original data lon range is (0, 360), convert swap lon
ds = xc.swap_lon_axis(ds, to=(-180, 180))

# proceed subset
if "longitude" in (ds.coords.dims):
ds = ds.sel(longitude=slice(lon0, lon1))
elif "lon" in (ds.coords.dims):
ds = ds.sel(lon=slice(lon0, lon1))

return ds
return ds
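Putting the refactored helpers together, usage looks roughly like this (a sketch; the input file is a placeholder assumed to hold a "ts" variable on a (0, 360) longitude grid):

```python
from pcmdi_metrics.io import load_regions_specs, region_subset, xcdat_open

regions_specs = load_regions_specs()
print(regions_specs["PDO"]["domain"])  # {'latitude': (20.0, 70), 'longitude': (110, 260)}

ds = xcdat_open("ts_example.nc")       # hypothetical monthly surface temperature file

# A Dataset in gives a Dataset back; passing ds["ts"] would return a DataArray instead.
# Because the NAO domain is defined on (-90, 40), the longitude axis is swapped to
# (-180, 180) automatically before subsetting when the input uses the (0, 360) convention.
ds_nao = region_subset(ds, region="NAO", data_var="ts", regions_specs=regions_specs)
```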