
[develop] Added an option for RRFS external model files used as ICS and LBCS #1089

Open · wants to merge 28 commits into base: develop (changes shown from 23 of 28 commits)

Commits (28):
30171b7
Added an option for RRFS external model files as ICS and LBCS
May 30, 2024
dfe4070
update UFS_UTILS tag
Jun 6, 2024
17aacde
additions for RRFS capability
Jun 7, 2024
ddf3e9d
changes for RRFS capability
Jun 7, 2024
dc12654
Merge branch 'rrfs_ics_lbcs' into feature/rrfs
Jun 7, 2024
6124c68
Merge branch 'develop' into rrfs_ics_lbcs
Jun 12, 2024
bd2b96b
Revert to existing tag to checkout UFS_UTILS
Jun 12, 2024
a05b2bd
allow RRFS files as ICS/LBCS in UFS_UTILS submodule
Jun 13, 2024
09f931d
add RRFS expected data source for ICS/LBCS, in parm/data_locations.ym…
Jun 13, 2024
efafe30
update plotting scripts, geographical data overlaid onto the plotted …
Jun 13, 2024
225a83b
update parm/wflow/plot.yaml to allow plotting tasks for ensemble fore…
Jun 18, 2024
659d9a2
Update plot.yaml to allow ensemble members
natalie-perlin Jun 20, 2024
5870169
adding a new test grid_RRFS_CONUScompact_25km_ics_RRFS_lbcs_RRFS_suit…
Jun 20, 2024
25f09a7
Updating documentation for RRFS ICS/LBCS capability
Jun 20, 2024
c4fbcf4
Merge remote-tracking branch 'origin/rrfs_ics_lbcs' into rrfs_ics_lbcs
Jun 20, 2024
ea5fb8c
Updated description of a new test for RRFS functionality
natalie-perlin Jun 24, 2024
0f55a67
Update config.grid_RRFS_CONUScompact_25km_ics_RRFS_lbcs_RRFS_suite_RR…
natalie-perlin Jun 24, 2024
dd0f33c
Update plot.yaml
natalie-perlin Jun 24, 2024
58beeed
Update config.grid_RRFS_CONUScompact_25km_ics_RRFS_lbcs_RRFS_suite_RR…
natalie-perlin Jun 24, 2024
05686a8
Update retrieve_data.py
natalie-perlin Jun 24, 2024
cb201b4
Update devbuild.sh
natalie-perlin Jun 24, 2024
8a7df4a
Merge branch 'develop' into rrfs_ics_lbcs
Jun 24, 2024
e574095
reduce ensemble size and forecast period for a new test with RRFS ICS…
Jun 25, 2024
571c3a6
Update doc/UsersGuide/CustomizingTheWorkflow/InputOutputFiles.rst
natalie-perlin Jun 25, 2024
241aac6
Update doc/UsersGuide/BackgroundInfo/Components.rst
natalie-perlin Jun 25, 2024
360aed9
Update InputOutputFiles.rst
natalie-perlin Jun 25, 2024
1824cec
Updates devclean.sh with safety checks for removing directories
Jun 26, 2024
a915b10
devclean.sh update
Jun 27, 2024
12 changes: 12 additions & 0 deletions devbuild.sh
@@ -265,6 +265,18 @@ if [ "${DEFAULT_BUILD}" = true ]; then
BUILD_UPP="on"
fi

# Allow the use of RRFS model output files remapped into CONUS 3-km grid as ICS/LBCS,
# before UFS_UTILS integrates use of native/full RRFS files
# Files could be retrieved from
# https://noaa-rrfs-pds.s3.amazonaws.com/rrfs_a/rrfs_a.{yyyymmdd}/{hh}/control/
# in the format rrfs.t{hh}z.prslev.f{fcst_hr:03d}.conus_3km.grib2
if [ "${BUILD_UFS_UTILS}" = "on" ]; then
os=$(uname) && SED=sed
test "$os" = Darwin && { os=MacOSX; SED=gsed; }  # subshell would not propagate SED
echo "SED is ${SED}"
CHGRES_CUBE=${SRW_DIR}/sorc/UFS_UTILS/sorc/chgres_cube.fd
${SED} -i 's/"RAP","HRRR"/"RAP","HRRR","RRFS"/g' ${CHGRES_CUBE}/program_setup.F90
fi
Collaborator:
This is not an appropriate fix to fortran codes.

Collaborator Author:
This is a temporary workaround to allow chgres_cube to build until a later tag that officially introduces the RRFS capability can be used. The current UFS_UTILS tag simply flags the "RRFS" option as not valid and reports an error. Newer UFS_UTILS tags do not yet work with the SRW App and require more adaptation.

Collaborator Author (@natalie-perlin, Jun 24, 2024):
Removed an extra empty line

Collaborator:

The appropriate workaround to needing code not yet committed to an authoritative repo is to keep the necessary code in a branch and point to that branch in the interim. Editing external code in the build process is a bad practice that sets a horrible precedent.

Collaborator:

Updating the UFS_UTILS hash to 1dac855 would bring in the necessary change to chgres_cube's program_setup.F90. However, all WE2E tests that use RAP and HRRR physics will fail while moving to this version of UFS_UTILS, as it is after the fractional grid update. It doesn't look like there has been any work on issue #961 in UFS_UTILS, so this is still keeping the SRW App from moving forward with updated versions of UFS_UTILS.

Collaborator:

I totally understand that there is no way to get the UFS_UTILS hash updated right now.

There are two ways to handle it -- a temporary branch in a fork that EPIC controls to hold this source code change, or holding off on this PR until that update can be made.

Editing source code in the build script is unacceptable.

Collaborator Author:

@christinaholtNOAA @MichaelLueken

Another possible approach would be to resolve chgres_cube's rejection of the RRFS option at the SRW script level, instead of changing the error condition in UFS_UTILS.

My suggestion is to change the variable ``external_model`` from "RRFS" to "HRRR" in the fort.41 namelist files before chgres_cube is called. These namelist files are stored in $EXPTDIR/[TEST_NAME]/[YYYYMMDDHH]/tmp_MAKE_ICS/ and ./tmp_MAKE_LBCS/ .
The changes would be made in exregional_make_ics.sh and exregional_make_lbcs.sh, or in exregional_get_extrn_mdl_files.sh.

That way, the RRFS files interpolated to a regular grid could still be retrieved from the RRFS AWS location as needed, according to the external model type set for the experiment, but would be processed by chgres_cube the same way as HRRR files. This should remain functional as long as HRRR and RAP remain options for the SRW initial and lateral boundary conditions.

Please let me know if this sounds like a viable alternative to removing the error flag in chgres_cube's program_setup.F90!
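To illustrate, the proposed workaround boils down to a one-line substitution on the namelist before chgres_cube runs. The fort.41 contents below are a hypothetical minimal example, not the full namelist the SRW App generates:

```shell
# Hypothetical minimal fort.41 namelist, standing in for the one written
# into tmp_MAKE_ICS/ or tmp_MAKE_LBCS/ by the ex-scripts.
cat > fort.41 <<'EOF'
&config
  external_model="RRFS"
/
EOF

# The substitution the ex-scripts would apply before invoking chgres_cube:
# relabel the external model so chgres_cube treats RRFS input like HRRR.
sed -i 's/external_model="RRFS"/external_model="HRRR"/' fort.41

grep 'external_model' fort.41
```

Since the RRFS files in question are already remapped to a regular grid, processing them through the HRRR code path is plausible, but this sketch assumes chgres_cube needs no other RRFS-specific handling.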

Collaborator:

Labeling RRFS ICs/LBCs as "HRRR" to get it through is fine with me. Modifying source code is not.

# Choose components to build for air quality modeling (Online-CMAQ)
if [ "${APPLICATION}" = "ATMAQ" ]; then
if [ "${DEFAULT_BUILD}" = true ]; then
2 changes: 1 addition & 1 deletion doc/UsersGuide/BackgroundInfo/Components.rst
@@ -22,7 +22,7 @@ UFS Preprocessing Utilities (UFS_UTILS)

The SRW Application includes a number of pre-processing utilities (UFS_UTILS) that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), these utilities generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of :term:`halo` cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software :term:`chgres_cube` is used to convert the raw external model data into initial and lateral boundary condition files in :term:`netCDF` format. These are needed as input to the :term:`FV3` limited area model (:term:`LAM`). Additional information about the UFS pre-processing utilities can be found in the :doc:`UFS_UTILS Technical Documentation <ufs-utils:index>` and in the `UFS_UTILS Scientific Documentation <https://ufs-community.github.io/UFS_UTILS/index.html>`__.

The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates.
The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), High-Resolution Rapid Refresh (:term:`HRRR`), and Rapid Refresh Forecast System (:term:`RRFS`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates.

.. WARNING::
For GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `NOAA Operational Model Archive and Distribution System <https://nomads.ncep.noaa.gov/>`__ (NOMADS). Raw external model data may be pre-staged on disk by the user.
5 changes: 3 additions & 2 deletions doc/UsersGuide/BuildingRunningTesting/RunSRW.rst
@@ -547,8 +547,9 @@ The ``data:`` section of the machine file can point to various data sources that
nemsio: /Users/username/DATA/UFS/FV3GFS/nemsio
grib2: /Users/username/DATA/UFS/FV3GFS/grib2
netcdf: /Users/username/DATA/UFS/FV3GFS/netcdf
RAP: /Users/username/DATA/UFS/RAP/grib2
HRRR: /Users/username/DATA/UFS/HRRR/grib2
RAP: /Users/username/DATA/UFS/RAP
HRRR: /Users/username/DATA/UFS/HRRR
RRFS: /Users/username/DATA/UFS/RRFS

This can be helpful when conducting multiple experiments with different types of data.

1 change: 1 addition & 0 deletions doc/UsersGuide/BuildingRunningTesting/WE2Etests.rst
@@ -78,6 +78,7 @@ For convenience, the WE2E tests are currently grouped into the following categor
FV3GFS:
RAP:
HRRR:
RRFS:

Some tests are duplicated among the above categories via symbolic links, both for legacy reasons (when tests for different capabilities were consolidated) and for convenience when a user would like to run all tests for a specific category (e.g., verification tests).

4 changes: 2 additions & 2 deletions doc/UsersGuide/CustomizingTheWorkflow/ConfigWorkflow.rst
@@ -912,7 +912,7 @@ Basic Task Parameters
For each workflow task, certain parameter values must be passed to the job scheduler (e.g., Slurm), which submits a job for the task.

``EXTRN_MDL_NAME_ICS``: (Default: "FV3GFS")
The name of the external model that will provide fields from which initial condition (IC) files, surface files, and 0-th hour boundary condition files will be generated for input into the forecast model. Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"GEFS"`` | ``"GDAS"`` | ``"RAP"`` | ``"HRRR"`` | ``"NAM"`` | ``"UFS-CASE-STUDY"``
The name of the external model that will provide fields from which initial condition (IC) files, surface files, and 0-th hour boundary condition files will be generated for input into the forecast model. Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"GEFS"`` | ``"GDAS"`` | ``"RAP"`` | ``"HRRR"`` | ``"RRFS"`` | ``"NAM"`` | ``"UFS-CASE-STUDY"``

``EXTRN_MDL_ICS_OFFSET_HRS``: (Default: 0)
Users may wish to start a forecast using forecast data from a previous cycle of an external model. This variable indicates how many hours earlier the external model started than the FV3 forecast configured here. For example, if the forecast should start from a 6-hour forecast of the GFS, then ``EXTRN_MDL_ICS_OFFSET_HRS: "6"``.
@@ -966,7 +966,7 @@ Basic Task Parameters
For each workflow task, certain parameter values must be passed to the job scheduler (e.g., Slurm), which submits a job for the task.

``EXTRN_MDL_NAME_LBCS``: (Default: "FV3GFS")
The name of the external model that will provide fields from which lateral boundary condition (LBC) files (except for the 0-th hour LBC file) will be generated for input into the forecast model. Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"GEFS"`` | ``"GDAS"`` | ``"RAP"`` | ``"HRRR"`` | ``"NAM"`` | ``"UFS-CASE-STUDY"``
The name of the external model that will provide fields from which lateral boundary condition (LBC) files (except for the 0-th hour LBC file) will be generated for input into the forecast model. Valid values: ``"GSMGFS"`` | ``"FV3GFS"`` | ``"GEFS"`` | ``"GDAS"`` | ``"RAP"`` | ``"HRRR"`` | ``"RRFS"`` | ``"NAM"`` | ``"UFS-CASE-STUDY"``

``LBC_SPEC_INTVL_HRS``: (Default: 6)
The interval (in integer hours) at which LBC files will be generated. This is also referred to as the *boundary update interval*. Note that the model selected in ``EXTRN_MDL_NAME_LBCS`` must have data available at a frequency greater than or equal to that implied by ``LBC_SPEC_INTVL_HRS``. For example, if ``LBC_SPEC_INTVL_HRS`` is set to "6", then the model must have data available at least every 6 hours. It is up to the user to ensure that this is the case.
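Putting these parameters together, an experiment taking both ICs and LBCs from RRFS might include a ``config.yaml`` excerpt along these lines (a sketch only; the 3-hour interval and section layout follow the SRW App's usual task sections, and the values are illustrative rather than taken from this PR):

```yaml
task_get_extrn_ics:
  EXTRN_MDL_NAME_ICS: RRFS
task_get_extrn_lbcs:
  EXTRN_MDL_NAME_LBCS: RRFS
  LBC_SPEC_INTVL_HRS: 3
```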
12 changes: 8 additions & 4 deletions doc/UsersGuide/CustomizingTheWorkflow/InputOutputFiles.rst
@@ -21,7 +21,8 @@ ways, including:

* Pulled from the `SRW App Data Bucket <https://registry.opendata.aws/noaa-ufs-shortrangeweather/>`__,
* Pulled from the NOAA High Performance Storage System (:term:`HPSS`) during the workflow execution (requires user access), or
* Obtained and staged by the user from a different source, or
* Pulled from the `NOAA-RRFS AWS S3 bucket <https://noaa-rrfs-pds.s3.amazonaws.com/index.html#rrfs_a/>`__; a description can be found at `NOAA Rapid Refresh Forecast System (RRFS) <https://registry.opendata.aws/noaa-rrfs/>`__.

The data format for these files can be :term:`GRIB2` or :term:`NEMSIO`. More information on downloading and setting up the external model data can be found in :numref:`Section %s <DownloadingStagingInput>`. Once the data is set up, the end-to-end application will run the system and write output files to disk.

@@ -246,7 +247,7 @@ The environment variables ``FIXgsm``, ``FIXorg``, and ``FIXsfc`` indicate the pa

Initial Condition/Lateral Boundary Condition File Formats and Source
-----------------------------------------------------------------------
The SRW Application currently supports raw initial and lateral boundary conditions from numerous models (i.e., FV3GFS, GEFS, GDAS, NAM, RAP, HRRR). The data can be provided in three formats: :term:`NEMSIO`, :term:`netCDF`, or :term:`GRIB2`.
The SRW Application currently supports raw initial and lateral boundary conditions from numerous models (i.e., FV3GFS, GEFS, GDAS, NAM, RAP, HRRR, RRFS). The data can be provided in three formats: :term:`NEMSIO`, :term:`netCDF`, or :term:`GRIB2`.

To download the model input data for the 12-hour "out-of-the-box" experiment configuration in ``config.community.yaml`` file, run:

@@ -273,7 +274,7 @@ The paths to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBC
USE_USER_STAGED_EXTRN_FILES: true
EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/ufs-srweather-app/input_model_data/FV3GFS/grib2/YYYYMMDDHH

The two ``EXTRN_MDL_SOURCE_BASEDIR_*CS`` variables describe where the :term:`IC <ICs>` and :term:`LBC <LBCs>` file directories are located, respectively. For ease of reusing ``config.yaml`` across experiments, it is recommended that users set up the raw :term:`IC/LBC <ICs/LBCs>` file paths to include the model name (e.g., FV3GFS, GEFS, GDAS, NAM, RAP, HRRR), data format (e.g., grib2, nemsio), and date (in ``YYYYMMDDHH`` format). For example: ``/path/to/input_model_data/FV3GFS/grib2/2019061518/``. While there is flexibility to modify these settings, this structure will provide the most reusability for multiple dates when using the SRW Application workflow.
The two ``EXTRN_MDL_SOURCE_BASEDIR_*CS`` variables describe where the :term:`IC <ICs>` and :term:`LBC <LBCs>` file directories are located, respectively. For ease of reusing ``config.yaml`` across experiments, it is recommended that users set up the raw :term:`IC/LBC <ICs/LBCs>` file paths to include the model name (e.g., FV3GFS, GEFS, GDAS, NAM, RAP, HRRR, RRFS), data format (e.g., grib2, nemsio, netcdf), and date (in ``YYYYMMDDHH`` format). For example: ``/path/to/input_model_data/FV3GFS/grib2/2019061518/``. While there is flexibility to modify these settings, this structure will provide the most reusability for multiple dates when using the SRW Application workflow.

When files are pulled from NOAA :term:`HPSS` (rather than downloaded from the data bucket), the naming convention looks something like this:

@@ -290,11 +291,12 @@ When files are pulled from NOAA :term:`HPSS` (rather than downloaded from the da

* RAP (GRIB2): ``rap.t{cycle}z.wrfprsf{fhr}.grib2``
* HRRR (GRIB2): ``hrrr.t{cycle}z.wrfprsf{fhr}.grib2``
* RRFS (GRIB2): ``rrfs.t{cycle}z.prslev.f{fhr}.conus.grib2``

where:

* ``{cycle}`` corresponds to the 2-digit hour of the day when the forecast cycle starts, and
* ``{fhr}`` corresponds to the 2- or 3-digit nth hour of the forecast (3-digits for FV3GFS/GDAS data and 2 digits for RAP/HRRR data).
* ``{fhr}`` corresponds to the 2- or 3-digit nth hour of the forecast (3 digits for FV3GFS/GDAS/RRFS data and 2 digits for RAP/HRRR data).

For example, a forecast using FV3GFS GRIB2 data that starts at 18h00 UTC would have a ``{cycle}`` value of 18, which is the 000th forecast hour. The LBCS file for 21h00 UTC would be named ``gfs.t18z.pgrb2.0p25.f003``.
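The naming patterns above can be composed mechanically; the snippet below sketches this with ``printf``, using the 18Z cycle and forecast hour 3 as sample values:

```shell
# Compose external-model file names from the documented patterns.
cyc=18   # 2-digit cycle hour ({cycle})
fhr=3    # forecast hour ({fhr})

# FV3GFS and RRFS use a 3-digit forecast hour ...
printf 'gfs.t%02dz.pgrb2.0p25.f%03d\n'          "$cyc" "$fhr"   # gfs.t18z.pgrb2.0p25.f003
printf 'rrfs.t%02dz.prslev.f%03d.conus.grib2\n' "$cyc" "$fhr"   # rrfs.t18z.prslev.f003.conus.grib2

# ... while RAP/HRRR use a 2-digit forecast hour.
printf 'hrrr.t%02dz.wrfprsf%02d.grib2\n'        "$cyc" "$fhr"   # hrrr.t18z.wrfprsf03.grib2
```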

@@ -353,6 +355,8 @@ AWS S3 Data Buckets:
* GDAS: https://registry.opendata.aws/noaa-gfs-bdp-pds/
* HRRR: https://registry.opendata.aws/noaa-hrrr-pds/ (necessary fields for initializing available for dates 2015 and newer)
* A list of the NOAA Open Data Dissemination (NODD) datasets can be found here: https://www.noaa.gov/nodd/datasets
* RRFS: https://noaa-rrfs-pds.s3.amazonaws.com/index.html#rrfs_a/ (experimental data is available starting 02/01/2024 for deterministic forecasts out to 60 hours at 00, 06, 12, and 18 UTC, and out to 18 hours for other cycles; earlier dates, from 05/01/2023 to 01/31/2024, may contain only the 00, 06, 12, and 18 UTC cycles, so users should verify that data exist for the needed dates)
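For reference, a full object URL in the RRFS bucket can be assembled from the ``url`` and ``file_names`` patterns this PR adds to ``parm/data_locations.yml``; the date, cycle, and forecast hour below are sample values only, and the download itself is left commented out:

```shell
# Build the AWS download URL for an RRFS control-member GRIB2 file,
# following the url/file_names entries added to parm/data_locations.yml.
yyyymmdd=20240201
hh=00
fcst_hr=6

base="https://noaa-rrfs-pds.s3.amazonaws.com/rrfs_a/rrfs_a.${yyyymmdd}/${hh}/control"
file=$(printf 'rrfs.t%sz.prslev.f%03d.conus.grib2' "$hh" "$fcst_hr")

echo "${base}/${file}"
# wget "${base}/${file}"   # uncomment to actually retrieve the file
```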

NCEI Archive:

2 changes: 1 addition & 1 deletion doc/UsersGuide/CustomizingTheWorkflow/LAMGrids.rst
@@ -75,7 +75,7 @@ The 3-km CONUS domain is ideal for running the ``FV3_RRFS_v1beta`` physics suite
The boundary of the ``RRFS_CONUS_3km`` domain is shown in :numref:`Figure %s <RRFS_CONUS_3km>` (in red), and the boundary of the :ref:`write component grid <WriteComp>` sits just inside the computational domain (in blue). This extra grid is required because the post-processing utility (:term:`UPP`) is unable to process data on the native FV3 gnomonic grid (in red). Therefore, model data are interpolated to a Lambert conformal grid (the write component grid) in order for the :term:`UPP` to read in and correctly process the data.

.. note::
While it is possible to initialize the FV3-LAM with coarser external model data when using the ``RRFS_CONUS_3km`` domain, it is generally advised to use external model data (such as HRRR or RAP data) that has a resolution similar to that of the native FV3-LAM (predefined) grid.
While it is possible to initialize the FV3-LAM with coarser external model data when using the ``RRFS_CONUS_3km`` domain, it is generally advised to use external model data (such as HRRR, RRFS, or RAP data) that has a resolution similar to that of the native FV3-LAM (predefined) grid.


Predefined SUBCONUS Grid Over Indianapolis
3 changes: 2 additions & 1 deletion doc/UsersGuide/Reference/Glossary.rst
@@ -227,7 +227,8 @@ Glossary
A central location in which files (e.g., data, code, documentation) are stored and managed.

RRFS
The `Rapid Refresh Forecast System <https://gsl.noaa.gov/focus-areas/unified_forecast_system/rrfs>`__ (RRFS) is NOAA's next-generation convection-allowing, rapidly-updated, ensemble-based data assimilation and forecasting system currently scheduled for operational implementation in 2024. It is designed to run forecasts on a 3-km :term:`CONUS` domain.
The `Rapid Refresh Forecast System <https://gsl.noaa.gov/focus-areas/unified_forecast_system/rrfs>`__ (RRFS) is NOAA's next-generation convection-allowing, rapidly-updated, ensemble-based data assimilation and forecasting system, currently scheduled for operational implementation in 2024. It is designed to run forecasts on a 3-km :term:`CONUS` domain; see also `NOAA Rapid Refresh Forecast System (RRFS) <https://registry.opendata.aws/noaa-rrfs/>`__. Experimental data is currently available from the `AWS S3 NOAA-RRFS <https://noaa-rrfs-pds.s3.amazonaws.com/index.html#rrfs_a/>`__ bucket starting 02/01/2024 for deterministic forecasts out to 60 hours at 00, 06, 12, and 18 UTC, and out to 18 hours for other cycles. Earlier dates, from 05/01/2023 to 01/31/2024, may contain only the 00, 06, 12, and 18 UTC cycles; users should verify that data exist for the needed dates.

SDF
Suite Definition File. An external file containing information about the construction of a physics suite. It describes the schemes that are called, in which order they are called, whether they are subcycled, and whether they are assembled into groups to be called together.
14 changes: 14 additions & 0 deletions parm/data_locations.yml
@@ -236,6 +236,20 @@ RAP:
file_names:
<<: *rap_file_names

RRFS:
hpss:
protocol: htar
file_names: &rrfs_file_names
anl:
- rrfs.t{hh}z.prslev.f{fcst_hr:03d}.conus.grib2
fcst:
- rrfs.t{hh}z.prslev.f{fcst_hr:03d}.conus.grib2
aws:
protocol: download
url: https://noaa-rrfs-pds.s3.amazonaws.com/rrfs_a/rrfs_a.{yyyymmdd}/{hh}/control/
file_names:
<<: *rrfs_file_names

HRRR:
hpss:
protocol: htar
54 changes: 31 additions & 23 deletions parm/wflow/plot.yaml
@@ -12,37 +12,45 @@ default_task_plot: &default_task
PDY: !cycstr "@Y@m@d"
cyc: !cycstr "@H"
subcyc: !cycstr "@M"
fhr: '#fhr#'
LOGDIR: !cycstr "&LOGDIR;"
SLASH_ENSMEM_SUBDIR: '&SLASH_ENSMEM_SUBDIR;'
ENSMEM_INDX: '#mem#'
nprocs: '{{ nnodes * ppn }}'
nprocs: '{{ parent.nnodes * parent.ppn }}'
join: !cycstr '&LOGDIR;/{{ jobname }}_@Y@m@d@H&LOGEXT;'
native: '{{ platform.SCHED_NATIVE_CMD }}'
nnodes: 1
nnodes: 2
Collaborator:
Is the plotting script MPI capable? If not, this is wasting resources that will never be used.

nodes: '{{ nnodes }}:ppn={{ ppn }}'
partition: '{% if platform.get("PARTITION_DEFAULT") %}&PARTITION_DEFAULT;{% else %}None{% endif %}'
ppn: 24
queue: '&QUEUE_DEFAULT;'
walltime: 01:00:00

task_plot_allvars:
<<: *default_task
command: '&LOAD_MODULES_RUN_TASK_FP; "plot_allvars" "&JOBSdir;/JREGIONAL_PLOT_ALLVARS"'
join: !cycstr '&LOGDIR;/{{ jobname }}_@Y@m@d@H&LOGEXT;'
dependency:
or_do_post: &post_files_exist
and_run_post: # If post was meant to run, wait on the whole post metatask
taskvalid:
attrs:
task: run_post_mem000_f000
metataskdep:
attrs:
metatask: run_ens_post
and_inline_post: # If inline post ran, wait on the forecast task to complete
not:
taskvalid:
attrs:
task: run_post_mem000_f000
taskdep:
attrs:
task: run_fcst_mem000
metatask_plot_allvars:
var:
mem: '{% if global.DO_ENSEMBLE %}{%- for m in range(1, global.NUM_ENS_MEMBERS+1) -%}{{ "%03d "%m }}{%- endfor -%} {% else %}{{ "000"|string }}{% endif %}'
metatask_plot_allvars_mem#mem#_all_fhrs:
var:
fhr: '{% for h in range(0, workflow.LONG_FCST_LEN+1) %}{{ " %03d" % h }}{% endfor %}'
cycledef: '{% for h in range(0, workflow.LONG_FCST_LEN+1) %}{% if h <= workflow.FCST_LEN_CYCL|min %}forecast {% else %}long_forecast {% endif %}{% endfor %}'
task_plot_allvars_mem#mem#_f#fhr#:
<<: *default_task
command: '&LOAD_MODULES_RUN_TASK_FP; "plot_allvars" "&JOBSdir;/JREGIONAL_PLOT_ALLVARS"'
dependency:
or_do_post: &post_files_exist
and_run_post: # If post was meant to run, wait on the whole post metatask
taskvalid:
attrs:
task: run_post_mem#mem#_f#fhr#
metataskdep:
attrs:
metatask: run_ens_post
and_inline_post: # If inline post ran, wait on the forecast task to complete
not:
taskvalid:
attrs:
task: run_post_mem#mem#_f#fhr#
taskdep:
attrs:
task: run_post_mem#mem#_f#fhr#
