diff --git a/README.md b/README.md
index 08af7365..b5003f38 100644
--- a/README.md
+++ b/README.md
@@ -2,15 +2,15 @@
The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA's operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader Weather Enterprise. For more information about the UFS, visit the UFS Portal at https://ufs.epic.noaa.gov/.
-The UFS includes [multiple applications](https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This repository hosts the source code for the UFS Land Data Assimilation (DA) System. Land DA is an offline version of the Noah Multi-Physics (Noah-MP) land surface model (LSM) used in the UFS Weather Model (WM). Its data assimilation framework uses the Joint Effort for Data assimilation Integration (JEDI) software stack, which includes the Object-Oriented Prediction System (OOPS) for the data assimilation algorithm, the Interface for Observation Data Access (IODA) for observation formatting and processing, and the Unified Forward Operator (UFO) for comparing model forecasts and observations.
+The UFS includes [multiple applications](https://ufs.epic.noaa.gov/applications/) that support different forecast durations and spatial domains. This repository hosts the source code for the UFS Land Data Assimilation (DA) System. Land DA is an offline version of the Noah Multi-Physics (Noah-MP) land surface model (LSM) used in the UFS Weather Model (WM). Its data assimilation framework uses the Joint Effort for Data assimilation Integration (JEDI) software stack, which includes the Object-Oriented Prediction System (OOPS) for the data assimilation algorithm, the Interface for Observation Data Access (IODA) for observation formatting and processing, and the Unified Forward Operator (UFO) for comparing model forecasts and observations.
The offline Noah-MP LSM is a standalone, uncoupled model used to execute land surface simulations. In this traditional uncoupled mode, near-surface atmospheric forcing data is required as input forcing. This LSM simulates soil moisture (both liquid and frozen), soil temperature, skin temperature, snow depth, snow water equivalent (SWE), snow density, canopy water content, and the energy flux and water flux terms of the surface energy balance and surface water balance. Its data assimilation framework applies the Local Ensemble Transform Kalman Filter-Optimal Interpolation (LETKF-OI) algorithm to combine the state-dependent background error derived from an ensemble forecast with the observations and their corresponding uncertainties to produce an analysis ensemble (Hunt et al., 2007).
The Noah-MP LSM has evolved through community efforts to pursue and refine a modern-era LSM suitable for use in the National Centers for Environmental Prediction (NCEP) operational weather and climate prediction models. This collaborative effort continues with participation from entities such as NCAR, NCEP, NASA, and university groups. The development branch of the Land DA System is continually evolving as the system undergoes open development. The latest Land DA release (v1.2.0) represents a snapshot of this continuously evolving system.
-The Land DA System User's Guide associated with the development branch is at: https://land-da-workflow.readthedocs.io/en/develop/, while the guide specific to the Land DA v1.2.0 release can be found at: https://land-da-workflow.readthedocs.io/en/release-public-v1.2.0/. Users may download data for use with the most recent release from the [Land DA data bucket](https://registry.opendata.aws/noaa-ufs-land-da/). The [Land DA Docker Hub](https://hub.docker.com/r/noaaepic/ubuntu20.04-intel-landda) hosts Land DA containers. These containers package the Land DA System together with all its software dependencies for an easier experience building and running Land DA.
+The Land DA System User's Guide associated with the development branch is at: https://land-da-workflow.readthedocs.io/en/develop/, while the guide specific to the Land DA v2.0.0 release can be found at: https://land-da-workflow.readthedocs.io/en/release-public-v2.0.0/. Users may download data for use with the most recent release from the [Land DA data bucket](https://registry.opendata.aws/noaa-ufs-land-da/). The [Land DA Docker Hub](https://hub.docker.com/r/noaaepic/ubuntu22.04-intel21.10-landda) hosts Land DA containers. These containers package the Land DA System together with all its software dependencies for an easier experience building and running Land DA.
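+
+For example, with Docker installed, users might pull a container image as follows (the tag shown is illustrative; see the Docker Hub repository for the current list of tags):
+
+```bash
+docker pull noaaepic/ubuntu22.04-intel21.10-landda:release-public-v2.0.0
+```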
For any publications based on work with the UFS Offline Land Data Assimilation System, please include a citation to the DOI below:
-UFS Development Team. (2023, Dec. 11). Unified Forecast System (UFS) Land Data Assimilation (DA) System (Version v1.2.0). Zenodo. https://doi.org/10.5281/zenodo.7675721
+UFS Development Team. (2024, October 30). Unified Forecast System (UFS) Land Data Assimilation (DA) System (Version v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.13909475
diff --git a/doc/Makefile b/doc/Makefile
index 7f122731..337aaf2d 100644
--- a/doc/Makefile
+++ b/doc/Makefile
@@ -15,9 +15,6 @@ help:
.PHONY: help Makefile linkcheck
-help:
- @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS)
-
doc:
make clean
$(MAKE) linkcheck
diff --git a/doc/source/BackgroundInfo/Introduction.rst b/doc/source/BackgroundInfo/Introduction.rst
index 6bfe1c90..d35b3c0f 100644
--- a/doc/source/BackgroundInfo/Introduction.rst
+++ b/doc/source/BackgroundInfo/Introduction.rst
@@ -9,22 +9,32 @@ This User's Guide provides guidance for running the Unified Forecast System
the Joint Effort for Data assimilation Integration (:term:`JEDI`) software. Currently, the offline UFS Land DA System only works with snow data.
Thus, this User's Guide focuses primarily on the snow DA process.
-Since the |latestr| release, the following capabilities have been added to the Land DA System:
+The following improvements were made to the Land DA System ahead of the |latestr| release:
* Added cycled run capability (:land-wflow-repo:`PR #101 `)
-* Provide automated run option using cron (:land-wflow-repo:`PR #110 `)
-* Added analysis plotting task (:land-wflow-repo:`PR #107 `)
+* Provided automated run option using cron (:land-wflow-repo:`PR #110 `)
+* Incorporated `Unified Workflow Tools `_:
+
+ * Added Rocoto tool to produce the Rocoto workflow XML file from a YAML configuration file (:land-wflow-repo:`PR #47 `)
+ * Added template tool to render a configuration file from a template (:land-wflow-repo:`PR #153 `)
+* Added plotting options:
+
+ * Analysis plotting task (:land-wflow-repo:`PR #107 `)
+ * Plotting option for forecast task restart files (:land-wflow-repo:`PR #149 `)
+ * Time-history plots (:land-wflow-repo:`PR #151 `)
+* Extended and updated container support (:land-wflow-repo:`PR #85 ` and :land-wflow-repo:`PR #147 `)
+* Ported ``land-DA_workflow`` to Hercules (:land-wflow-repo:`PR #133 `)
+* Added prerequisites for workflow end-to-end (WE2E) testing capability (:land-wflow-repo:`PR #131 `)
* Upgraded to JEDI Skylab v7.0 (:land-wflow-repo:`PR #92 `)
* Upgraded to spack-stack v1.6.0 (:land-wflow-repo:`PR #102 `)
-* Extended container support (:land-wflow-repo:`PR #85 `)
-* Updated directory structure for NCO compliance (:land-wflow-repo:`PR #75 `)
+* Updated directory structure for NCO compliance (e.g., :land-wflow-repo:`PR #75 `)
+* Added platform test to CTest and updated the UFS WM version (:land-wflow-repo:`PR #146 `)
* Removed land driver from CTest (:land-wflow-repo:`PR #123 `)
-* Removed land-driver and vector2tile (:land-wflow-repo:`PR #129 `)
+* Removed land driver and vector2tile (:land-wflow-repo:`PR #129 `)
The Land DA System citation is as follows and should be used when presenting results based on research conducted with the Land DA System:
-UFS Development Team. (2023, December 11). Unified Forecast System (UFS) Land Data Assimilation (DA) System (Version v1.2.0). Zenodo. https://doi.org/10.5281/zenodo.7675721
+UFS Development Team. (2024, October 30). Unified Forecast System (UFS) Land Data Assimilation (DA) System (Version v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.13909475

Organization
**************
@@ -39,9 +49,9 @@ Background Information
Building, Running, and Testing the Land DA System
===================================================
- * :numref:`Chapter %s: Land DA Workflow ` explains how to build and run the Land DA System on :ref:`Level 1 ` systems (currently Hera and Orion).
+ * :numref:`Chapter %s: Land DA Workflow ` explains how to build and run the Land DA System on :ref:`Level 1 ` systems (currently Hera, Orion, and Hercules).
* :numref:`Chapter %s: Containerized Land DA Workflow ` explains how to build and run the containerized Land DA System on non-Level 1 systems.
- * :numref:`Chapter %s: Testing the Land DA Workflow ` explains how to run the Land DA CTests.
+ * :numref:`Chapter %s: Testing the Land DA Workflow ` explains how to run Land DA System tests.
Customizing the Workflow
=========================
@@ -63,11 +73,11 @@ User Support and Documentation
Questions
==========
-The Land DA System's `GitHub Discussions `__ forum provides online support for UFS users and developers to post questions and exchange information. When users encounter difficulties running the Land DA System, this is the place to post. Users can expect an initial response within two business days.
+The Land DA System's `GitHub Discussions `_ forum provides online support for UFS users and developers to post questions and exchange information. When users encounter difficulties running the Land DA System, this is the place to post. Users can expect an initial response within two business days.
When posting a question, it is recommended that users provide the following information:
-* The platform or system being used (e.g., Hera, Orion, container, MacOS, Linux)
+* The platform or system being used (e.g., Hera, Orion, container)
* The version of the Land DA System being used (e.g., ``develop``, ``release/public-v1.1.0``). (To determine this, users can run ``git branch``, and the name of the branch with an asterisk ``*`` in front of it is the name of the branch or tag they are working with.) Note that the Land DA version being used and the version of the documentation being used should match, or users will run into difficulties.
* Stage of the application when the issue appeared (i.e., build/compilation, configuration, or forecast run)
* Contents of relevant configuration files
@@ -86,8 +96,8 @@ Feature Requests and Enhancements
Users who want to request a feature enhancement or the addition of a new feature have a few options:
- #. File a `GitHub Issue `__ and add (or request that a code manager add) the ``EPIC Support Requested`` label.
- #. Post a request for a feature or enhancement in the `Enhancements `__ category of GitHub Discussions. These feature requests will be forwarded to the Earth Prediction Innovation Center (`EPIC `__) management team for prioritization and eventual addition to the Land DA System.
+ #. File a `GitHub Issue `_ and add (or request that a code manager add) the ``EPIC Support Requested`` label.
+ #. Post a request for a feature or enhancement in the `Enhancements `_ category of GitHub Discussions. These feature requests will be forwarded to the Earth Prediction Innovation Center (`EPIC `_) management team for prioritization and eventual addition to the Land DA System.
#. Email the request to support.epic@noaa.gov.
@@ -99,10 +109,10 @@ Background Information
Unified Forecast System (UFS)
===============================
-The UFS is a community-based, coupled, comprehensive Earth modeling system. It includes `multiple applications `__ that support different forecast durations and spatial domains. NOAA's operational model suite for numerical weather prediction (:term:`NWP`) is quickly transitioning to the UFS from many different modeling systems.
+The UFS is a community-based, coupled, comprehensive Earth modeling system. It includes :ufs:`multiple applications ` that support different forecast durations and spatial domains. NOAA's operational model suite for numerical weather prediction (:term:`NWP`) is quickly transitioning to the UFS from many different modeling systems.
The UFS is designed to enable research, development, and contribution
opportunities within the broader :term:`Weather Enterprise` (including
-government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__.
+government, industry, and academia). For more information about the UFS, visit the :ufs:`UFS Portal <>`.
.. _NoahMP:
@@ -112,14 +122,16 @@ Noah-MP
The offline Noah-MP LSM is a stand-alone, uncoupled model used to execute land surface simulations. In this traditional uncoupled mode, near-surface atmospheric :term:`forcing data` are required as input forcing. This LSM simulates soil moisture (both liquid and frozen), soil temperature, skin temperature, snow depth, snow water equivalent (SWE), snow density, canopy water content, and the energy flux and water flux terms of the surface energy balance and surface water balance.
-Noah-MP uses a big-leaf approach with a separated vegetation canopy accounting
-for vegetation effects on surface energy and water balances, a modified two-stream
-approximation scheme to include the effects of vegetation canopy gaps that vary
-with solar zenith angle and the canopy 3-D structure on radiation transfer,
-a 3-layer physically-based snow model, a more permeable frozen soil by separating
-a grid cell into a permeable fraction and impermeable fraction, a simple
-groundwater model with a TOPMODEL-based runoff scheme, and a short-term leaf
-phenology model. Noah-MP LSM enables a modular framework for diagnosing differences
+Noah-MP uses:
+
+* a big-leaf approach with a separated vegetation canopy accounting for vegetation effects on surface energy and water balances,
+* a modified two-stream approximation scheme to include the effects of vegetation canopy gaps that vary with solar zenith angle and the canopy 3-D structure on radiation transfer,
+* a 3-layer physically-based snow model,
+* a more permeable frozen soil by separating a grid cell into a permeable fraction and impermeable fraction,
+* a simple groundwater model with a TOPMODEL-based runoff scheme, and
+* a short-term leaf phenology model.
+
+The Noah-MP LSM enables a modular framework for diagnosing differences
in process representation, facilitating ensemble forecasts and uncertainty
quantification, and choosing process representations appropriate for the application.
Noah-MP developers designed multiple parameterization options for leaf dynamics,
@@ -127,7 +139,7 @@ radiation transfer, stomatal resistance, soil moisture stress factor for stomata
resistance, aerodynamic resistance, runoff, snowfall, snow surface albedo,
supercooled liquid water in frozen soil, and frozen soil permeability.
-The Noah-MP LSM has evolved through community efforts to pursue and refine a modern-era LSM suitable for use in the National Centers for Environmental Prediction (NCEP) operational weather and climate prediction models. This collaborative effort continues with participation from entities such as NCAR, NCEP, NASA, and university groups.
+The Noah-MP LSM has evolved through community efforts to pursue and refine a modern-era LSM suitable for use in the National Centers for Environmental Prediction (:term:`NCEP`) operational weather and climate prediction models. This collaborative effort continues with participation from entities such as NCAR, NCEP, NASA, and university groups.
Noah-MP has been implemented in the UFS via the :term:`CCPP` physics package and
is currently being tested for operational use in GFSv17 and RRFS v2. Additionally, the UFS Weather Model now contains a Noah-MP land component. Noah-MP has
diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst
index a23ffa76..2dc0fc3b 100644
--- a/doc/source/BackgroundInfo/TechnicalOverview.rst
+++ b/doc/source/BackgroundInfo/TechnicalOverview.rst
@@ -19,8 +19,8 @@ Minimum System Requirements
Additionally, users will need:
- * Disk space: ~23GB (11GB for Land DA System [or 6.5GB for Land DA container], 11GB for Land DA data, and ~1GB for staging and output)
- * 7 CPU cores (or option to run with "oversubscribe")
+ * Disk space: ~24GB (11GB for Land DA System [or 6.5GB for Land DA container], 12GB for Land DA data, and ~1GB for staging and output)
+ * 26 CPU cores (13 CPUs may be possible, but this has not been tested)
Software Prerequisites
========================
@@ -47,14 +47,14 @@ Supported Systems for Running Land DA
Four levels of support have been defined for :term:`UFS` applications, and the Land DA System operates under this paradigm:
-* **Level 1** *(Pre-configured)*: Prerequisite software libraries are pre-built and available in a central location; code builds; full testing of model.
+* **Level 1** *(Preconfigured)*: Prerequisite software libraries are pre-built and available in a central location; code builds; full testing of model.
* **Level 2** *(Configurable)*: Prerequisite libraries are not available in a centralized location but are expected to install successfully; code builds; full testing of model.
* **Level 3** *(Limited-test platforms)*: Libraries and code build on these systems, but there is limited testing of the model.
* **Level 4** *(Build-only platforms)*: Libraries and code build, but running the model is not tested.
Level 1 Systems
==================
-Preconfigured (Level 1) systems for Land DA already have the required external libraries available in a central location via :term:`spack-stack` and the :term:`jedi-bundle` (Skylab |skylabv|). Land DA is expected to build and run out-of-the-box on these systems, and users can download the Land DA code without first installing prerequisite software. With the exception of the Land DA container, users must have access to these Level 1 systems in order to use them. For the most updated information on stack locations, compilers, and MPI, users can check the :land-wflow-repo:`build and run version files ` for their machine of choice.
+Preconfigured (Level 1) systems for Land DA already have the required external libraries available in a central location via :term:`spack-stack` and the :term:`jedi-bundle` (|skylabv|). Land DA is expected to build and run out-of-the-box on these systems, and users can download the Land DA code without first installing prerequisite software. With the exception of the Land DA container, users must have access to these Level 1 systems in order to use them. For the most updated information on stack locations, compilers, and MPI, users can check the :land-wflow-repo:`build and run version files ` for their machine of choice.
.. _stack-compiler-locations:
@@ -70,18 +70,18 @@ Preconfigured (Level 1) systems for Land DA already have the required external l
* - Hera
- intel/2021.5.0
- impi/2021.5.1
- - /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/unified-env-rocky8/install/modulefiles/Core
+ - /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/fms-2024.01/install/modulefiles/Core
- /scratch2/NAGAPE/epic/UFS_Land-DA_Dev/jedi_v7
* - Orion
- intel/2021.9.0
- impi/2021.9.0
- - /work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/unified-env-rocky9/install/modulefiles/Core
+ - /work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/fms-2024.01/install/modulefiles/Core
- /work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6
* - Hercules
- intel/2021.9.0
- impi/2021.9.0
- - /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/modulefiles/Core
- - /work2/noaa/epic/UFS_Land-DA_Dev/jedi_v7_hercules
+ - /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/fms-2024.01/install/modulefiles/Core
+ - /work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_hercules
* - Container
- intel-oneapi-compilers/2021.10.0
- intel-oneapi-mpi/2021.9.0
@@ -105,7 +105,7 @@ Hierarchical Repository Structure
The main repository for the Land DA System is named ``land-DA_workflow``;
it is available on GitHub at https://github.com/ufs-community/land-DA_workflow.
-This :term:`umbrella repository` uses Git submodules and an ``app_build.sh`` file to pull in the appropriate versions of external repositories associated with the Land DA System. :numref:`Table %s ` describes the various subrepositories that form the UFS Land DA System.
+This :term:`umbrella repository` uses Git submodules and an ``app_build.sh`` file to pull in code from the appropriate versions of external repositories associated with the Land DA System. :numref:`Table %s ` describes the various subrepositories that form the UFS Land DA System.
.. _LandDAComponents:
@@ -133,7 +133,7 @@ This :term:`umbrella repository` uses Git submodules and an ``app_build.sh`` fil
File & Directory Structure
============================
-The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operations (NCO) :nco:`WCOSS Implementation Standards `. When the ``develop`` branch of the ``land-DA_workflow`` repository is cloned with the ``--recursive`` argument, the specific GitHub repositories described in ``/sorc/app_build.sh`` are cloned into ``sorc``. The diagram below illustrates the file and directory structure of the Land DA System. Directories in parentheses () are only visible after the build step. Some files and directories have been removed for brevity.
+The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operations (NCO) :nco:`WCOSS Implementation Standards `. When the ``land-DA_workflow`` repository is cloned with the ``--recursive`` argument, the specific GitHub repositories described in ``/sorc/app_build.sh`` are cloned into ``sorc``. The diagram below illustrates the file and directory structure of the Land DA System. Directories in parentheses () are only visible after the build step. Some files and directories have been removed for brevity.
.. code-block:: console
@@ -145,31 +145,41 @@ The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operatio
├── (lib*)
├── modulefiles
├── parm
+ │ ├── jedi
+ │ ├── templates
+ │ │ └── template.land_analysis.yaml
│ ├── check_release_outputs.sh
- │ ├── land_analysis__.yaml
- │ └── run_without_rocoto.sh
+ │ ├── detect_platform.sh
+ │ ├── parm_xml_.yaml
+ │ └── launch_rocoto_wflow.sh
├── scripts
├── sorc
| ├── apply_incr.fd
- | | ├── apply_incr_noahmp_snow.f90
- | | └── NoahMPdisag_module.f90
+ | | └── sorc
+ | | ├── apply_incr_noahmp_snow.f90
+ | | └── NoahMPdisag_module.f90
│ ├── (build)
│ ├── cmake
- │ │ └── compiler_flags_*.cmake
│ ├── (conda)
+ | | └── envs
+ | | └── land_da
│ ├── test
+ │ │ ├── _ctest.sh
+ │ │ └── run__ctest.sh
│ ├── tile2tile_converter.fd
│ ├── ufs_model.fd
│ ├── CMakeLists.txt
│ └── app_build.sh
├── ush
+ | ├── fill_jinja_template.py
| ├── hofx_analysis_stats.py
- | └── letkf_create_ens.py
+ | ├── letkf_create_ens.py
+ | └── plot_forecast_restart.py
├── versions
├── LICENSE
└── README.md
-:numref:`Table %s ` describes the contents of the most important Land DA subdirectories. :numref:`Section %s ` describes the Land DA System components. Users can reference the :nco:`NCO Implementation Standards ` (p. 19) for additional details on repository structure in NCO-compliant repositories.
+:numref:`Table %s ` describes the contents of the most important Land DA subdirectories. :numref:`Section %s ` describes the Land DA System components. Users may reference the :nco:`NCO Implementation Standards ` (p. 19) for additional details on repository structure in NCO-compliant repositories.
.. _Subdirectories:
@@ -207,12 +217,9 @@ The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operatio
The UFS Land Component
=========================
-The UFS Land DA System has been updated to build the UFS Noah-MP land component as part of the build process.
-Updates allowing the Land DA System to run with the land component are underway.
-
-The land component makes use of a National Unified Operational Prediction Capability (:term:`NUOPC`) cap to interface with a coupled modeling system.
-Unlike the standalone Noah-MP land driver, the Noah-MP :term:`NUOPC cap` is able to create an :term:`ESMF` multi-tile grid by reading in a mosaic grid file. For the domain, the :term:`FMS` initializes reading and writing of the cubed-sphere tiled output. Then, the Noah-MP land component reads static information and initial conditions (e.g., surface albedo) and interpolates the data to the date of the simulation. The solar zenith angle is calculated based on the time information.
+The UFS Land DA System has been updated to build and run the UFS Noah-MP land component. The land component makes use of a National Unified Operational Prediction Capability (:term:`NUOPC`) cap to interface with a coupled modeling system.
+This Noah-MP :term:`NUOPC cap` is able to create an :term:`ESMF` multi-tile grid by reading in a mosaic grid file. For the domain, the :term:`FMS` initializes reading and writing of the cubed-sphere tiled output. Then, the Noah-MP land component reads static information and initial conditions (e.g., surface albedo) and interpolates the data to the date of the simulation. The solar zenith angle is calculated based on the time information.
Unified Workflow (UW) Tools
============================
-The Unified Workflow (UW) is a set of tools intended to unify the workflow for various UFS applications under one framework. The UW toolkit currently includes rocoto, template, and configuration (config) tools, which are being incorporated into the Land DA workflow. Additional tools are under development. More details about UW tools can be found in the `uwtools `_ GitHub repository and in the :uw:`UW Documentation <>`.
+The Unified Workflow (UW) is a set of tools intended to unify the workflow for various UFS applications under one framework. The UW toolkit includes rocoto, template, and configuration (config) tools, and additional tools and drivers are under development. The Land DA workflow makes use of the template tool to fill in user-specified values in the configuration file. It then uses the rocoto tool to generate a workflow XML file from the configuration file; other UW tools may be incorporated into the workflow in the future. More details about UW tools can be found in the `uwtools `_ GitHub repository and in the :uw:`UW Documentation <>`.
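+
+In practice, the workflow's use of these two tools comes down to two commands, run from the ``parm`` directory (the same commands appear in the build-and-run chapters):
+
+.. code-block:: console
+
+   # Render the experiment configuration from the template and user-specified values
+   uw template render --input-file templates/template.land_analysis.yaml --values-file parm_xml.yaml --output-file land_analysis.yaml
+   # Generate the Rocoto workflow XML from the rendered configuration
+   uw rocoto realize --input-file land_analysis.yaml --output-file land_analysis.xml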
diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst
index 62cdac09..87e35a54 100644
--- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst
+++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst
@@ -4,11 +4,11 @@
Land DA Workflow (Hera/Orion/Hercules)
***************************************
-This chapter provides instructions for building and running basic Land DA cases for the Unified Forecast System (:term:`UFS`) Land DA System using a Jan. 3-4, 2000 00z sample case using :term:`GSWP3` data with the UFS Noah-MP land component.
+This chapter provides instructions for building and running the Unified Forecast System (:term:`UFS`) Land DA System with a Jan. 3-4, 2000 00z sample case that uses :term:`GSWP3` data with the UFS Noah-MP land component and data atmosphere (DATM) component.
.. attention::
- These steps are designed for use on :ref:`Level 1 ` systems (i.e., Hera and Orion) and may require significant changes on other systems. It is recommended that users on other systems run the containerized version of Land DA. Users may reference :numref:`Chapter %s: Containerized Land DA Workflow ` for instructions.
+ These steps are designed for use on :ref:`Level 1 ` systems (e.g., Hera, Orion) and may require significant changes on other systems. It is recommended that users on other systems run the containerized version of Land DA. Users may reference :numref:`Chapter %s: Containerized Land DA Workflow ` for instructions.
.. _create-dir:
@@ -37,17 +37,17 @@ In this documentation, ``$LANDDAROOT`` is used, but users are welcome to choose
Get Code
***********
-Clone the Land DA workflow repository. To clone the ``develop`` branch, run:
+Clone the Land DA workflow repository. To clone the ``develop`` branch, run:
.. code-block:: console
git clone -b develop --recursive https://github.com/ufs-community/land-DA_workflow.git
-To clone the most recent release, run the same command with |branch| in place of ``develop``:
+To clone the most recent release, run the same command with |branch| in place of ``develop``:
.. code-block:: console
- git clone -b release/public-v1.2.0 --recursive https://github.com/ufs-community/land-DA_workflow.git
+ git clone -b release/public-v2.0.0 --recursive https://github.com/ufs-community/land-DA_workflow.git
.. _build-land-da:
@@ -66,6 +66,7 @@ Build the Land DA System
./app_build.sh
+ Users may need to press the ``Enter`` key to advance the build once the list of currently loaded modules appears.
If the code successfully compiles, the console output should end with:
.. code-block:: console
@@ -92,14 +93,9 @@ Load the Workflow Environment
To load the workflow environment, run:
-.. code-block:: console
-
- cd $LANDDAROOT/land-DA_workflow
- module use modulefiles
- module load wflow_
- conda activate land_da
+.. include:: ../doc-snippets/load-env.rst
-where ```` is ``hera`` or ``orion``. This activates the ``land_da`` conda environment, and the user typically sees ``(land_da)`` in front of the Terminal prompt at this point.
+This activates the ``land_da`` conda environment, and the user typically sees ``(land_da)`` in front of the Terminal prompt at this point.
.. _configure-expt:
Copy the experiment settings into ``parm_xml.yaml``:
.. code-block:: console
cd $LANDDAROOT/land-DA_workflow/parm
- cp land_analysis_.yaml land_analysis.yaml
+ cp parm_xml_.yaml parm_xml.yaml
-where ```` is ``hera`` or ``orion``.
+where ```` is ``hera``, ``orion``, or ``hercules``.
-Users will need to configure certain elements of their experiment in ``land_analysis.yaml``:
+Users will need to configure the ``account`` and ``exp_basedir`` variables in ``parm_xml.yaml``:
- * ``ACCOUNT:`` A valid account name. Hera, Orion, and most NOAA RDHPCS systems require a valid account name; other systems may not (in which case, any value will do).
- * ``EXP_BASEDIR:`` The full path to the directory where land-DA_workflow was cloned (i.e., ``$LANDDAROOT``)
- * ``cycledef/spec:`` Cycle specification
+ * ``account:`` A valid account name. Hera, Orion, Hercules, and most NOAA :term:`RDHPCS` systems require a valid account name; other systems may not (in which case, any value will do).
+ * ``exp_basedir:`` The full path to the directory where ``land-DA_workflow`` was cloned (i.e., ``$LANDDAROOT``). For example, if ``land-DA_workflow`` is located at ``/scratch2/NAGAPE/epic/User.Name/landda/land-DA_workflow`` on Hera, set ``exp_basedir`` to its parent directory: ``/scratch2/NAGAPE/epic/User.Name/landda``.
.. note::
- To determine an appropriate ``ACCOUNT`` field for Level 1 systems that use the Slurm job scheduler, run ``saccount_params``. On other systems, running ``groups`` will return a list of projects that the user has permissions for. Not all listed projects/groups have an HPC allocation, but those that do are potentially valid account names.
+ To determine an appropriate ``account`` field for Level 1 systems that use the Slurm job scheduler, run ``saccount_params``. On other systems, running ``groups`` will return a list of projects that the user has permissions for. Not all listed projects/groups have an HPC allocation, but those that do are potentially valid account names.
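+
+A minimal sketch of these two settings in ``parm_xml.yaml`` (layout simplified; the account name and path are illustrative):
+
+.. code-block:: yaml
+
+   account: "epic"
+   exp_basedir: "/scratch2/NAGAPE/epic/User.Name/landda"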
-Users may configure other elements of an experiment in ``land_analysis.yaml`` if desired. The ``land_analysis_*.yaml`` files contain reasonable default values for running a Land DA experiment. Users who wish to run a more complex experiment may change the values in these files and the files they reference using information in Sections :numref:`%s `, :numref:`%s `, and :numref:`%s `.
+Users may configure other elements of an experiment in ``parm/templates/template.land_analysis.yaml`` if desired. For example, users may wish to alter the ``cycledef.spec`` to indicate a different start cycle, end cycle, and increment. The ``template.land_analysis.yaml`` file contains reasonable default values for running a Land DA experiment. Users who wish to run a more complex experiment may change the values in this file using information from Sections :numref:`%s: Workflow Configuration Parameters `, :numref:`%s: I/O for the Noah-MP Model `, and :numref:`%s: I/O for JEDI DA `.
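+
+For instance, a hypothetical ``cycledef`` entry for the Jan. 3-4, 2000 sample case might look like the following (the surrounding YAML structure is abbreviated; the ``spec`` string uses Rocoto's start/stop/increment form):
+
+.. code-block:: yaml
+
+   cycledef:
+     - attrs:
+         group: cycled
+       spec: 200001030000 200001040000 24:00:00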
.. _GetData:
Data
------
-:numref:`Table %s ` shows the locations of pre-staged data on NOAA :term:`RDHPCS` (i.e., Hera and Orion). These data locations are already included in the ``land_analysis_*.yaml`` files but are provided here for informational purposes.
+:numref:`Table %s ` shows the locations of pre-staged data on NOAA :term:`RDHPCS` (e.g., Hera, Orion). These data locations are already linked to the Land DA System during the build but are provided here for informational purposes.
.. _Level1Data:
@@ -146,7 +141,7 @@ Data
* - Hercules & Orion
- /work/noaa/epic/UFS_Land-DA_Dev/inputs
-Users who have difficulty accessing the data on Hera or Orion may download it according to the instructions in :numref:`Section %s `. Its subdirectories are soft-linked to the ``fix`` directory of ``land-DA_workflow`` by the build script ``sorc/app_build.sh``.
+Users who have difficulty accessing the data on Hera, Orion, or Hercules may download it according to the instructions in :numref:`Section %s `. The data's subdirectories are soft-linked to the ``land-DA_workflow/fix`` directory by the build script (``sorc/app_build.sh``); newly downloaded data should be placed in or linked to the ``fix`` directory.
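+
+For example, data downloaded to a location outside the repository could be linked in with a command along these lines (paths are hypothetical):
+
+.. code-block:: console
+
+   ln -s $LANDDAROOT/inputs/* $LANDDAROOT/land-DA_workflow/fix/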
.. _generate-wflow:
@@ -157,9 +152,10 @@ Generate the workflow XML file with ``uwtools`` by running:
.. code-block:: console
+ uw template render --input-file templates/template.land_analysis.yaml --values-file parm_xml.yaml --output-file land_analysis.yaml
uw rocoto realize --input-file land_analysis.yaml --output-file land_analysis.xml
-If the command runs without problems, ``uwtools`` will output a "0 errors found" message similar to the following:
+If the commands run without issue, ``uwtools`` will output a "0 errors found" message similar to the following:
.. code-block:: console
@@ -193,10 +189,10 @@ Each Land DA experiment includes multiple tasks that must be run in order to sat
- Runs :term:`JEDI` and adds the increment to the surface data files
* - JLANDDA_POST_ANAL
- Transfers the JEDI result from the surface data files to the restart files
- * - JLANDDA_PLOT_STATS
- - Plots the JEDI result (scatter/histogram)
* - JLANDDA_FORECAST
- Runs the forecast model
+ * - JLANDDA_PLOT_STATS
+ - Plots the JEDI result (scatter/histogram) and the restart files
Users may run these tasks :ref:`using the Rocoto workflow manager `.
@@ -210,7 +206,7 @@ To run the experiment, users can automate job submission via :term:`crontab` or
Automated Run
---------------
-To automate task submission, users must be on a system where :term:`cron` is available. On Orion, cron is only available on the orion-login-1 node, so users will need to work on that node when running cron jobs on Orion.
+To automate task submission, users must be on a system where :term:`cron` is available. On Orion, cron is only available on the orion-login-1 node; on Hercules, it is only available on the hercules-login-1 node. Users will need to work on those login nodes when running cron jobs on Orion or Hercules.
.. code-block:: console
@@ -255,17 +251,17 @@ If ``rocotorun`` was successful, the ``rocotostat`` command will print a status
200001030000 pre_anal druby://10.184.3.62:41973 SUBMITTING - 1 0.0
200001030000 analysis - - - - -
200001030000 post_anal - - - - -
- 200001030000 plot_stats - - - - -
200001030000 forecast - - - - -
+ 200001030000 plot_stats - - - - -
=========================================================================================================
200001040000 prep_obs druby://10.184.3.62:41973 SUBMITTING - 1 0.0
200001040000 pre_anal - - - - -
200001040000 analysis - - - - -
200001040000 post_anal - - - - -
- 200001040000 plot_stats - - - - -
200001040000 forecast - - - - -
+ 200001040000 plot_stats - - - - -
-Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command (whether issued manually or via cron automation). For each task, a log file is generated. These files are stored in ``$LANDDAROOT/ptmp/test/com/output/logs/run_``, where ```` is either ``gswp3`` or ``era5``.
+Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command (whether issued manually or via cron automation). For each task, a log file is generated. These files are stored in ``$LANDDAROOT/ptmp/test/com/output/logs``.
The experiment has successfully completed when all tasks say SUCCEEDED under STATE. Other potential statuses are: QUEUED, SUBMITTING, RUNNING, and DEAD. Users may view the log files to determine why a task may have failed.
@@ -280,8 +276,8 @@ As the experiment progresses, it will generate a number of directories to hold i
.. code-block:: console
- $LANDDAROOT: Base directory
- ├── land-DA_workflow(): Home directory of the land DA workflow
+ $LANDDAROOT (): Base directory
+ ├── land-DA_workflow ( or ): Home directory of the land DA workflow
└── ptmp ()
└── test ( or )
└── com ()
@@ -299,7 +295,7 @@ As the experiment progresses, it will generate a number of directories to hold i
├── hofx: Directory containing the soft links to the results of the analysis task for plotting
└── DATA_RESTART: Directory containing the soft links to the restart files for the next cycles
-```` refers to the type of forcing data used (``gswp3`` or ``era5``). Each variable in parentheses and angle brackets (e.g., ``()``) is the name for the directory defined in the file ``land_analysis.yaml``. In the future, this directory structure will be further modified to meet the :nco:`NCO Implementation Standards<>`.
+Each variable in parentheses and angle brackets (e.g., ``()``) is a directory name defined in ``land_analysis.yaml`` (derived from ``template.land_analysis.yaml`` and ``parm_xml.yaml``) or in the NCO Implementation Standards. For example, the ```` variable is set to "test" (i.e., ``envir: "test"``) in ``template.land_analysis.yaml``. In the future, this directory structure will be further modified to meet the :nco:`NCO Implementation Standards<>`.
Check for the output files for each cycle in the experiment directory:
@@ -307,7 +303,7 @@ Check for the output files for each cycle in the experiment directory:
ls -l $LANDDAROOT/ptmp/test/com/landda//landda.YYYYMMDD
-where ``YYYYMMDD`` is the cycle date, and ```` is the model version (currently ``v1.2.1`` in the ``develop`` branch). The experiment should generate several restart files.
+where ``YYYYMMDD`` is the cycle date, and ```` is the model version (currently |latestr| in the ``develop`` branch). The experiment should generate several restart files.
.. _plotting:
@@ -332,3 +328,15 @@ The histogram plots OMA values on the x-axis and frequency density values on the
* - |logo1|
- |logo2|
+
+.. note::
+
+ There are many options for viewing plots, and instructions for this are highly machine dependent. Users should view the data transfer documentation for their system to secure copy files from a remote system (such as :term:`RDHPCS`) to their local system.
+ Another option is to download `Xming `_ (for Windows) or `XQuartz `_ (for Mac), use the ``-X`` option when connecting to a remote system via SSH, and run:
+
+ .. code-block:: console
+
+ module load imagemagick
+ display file_name.png
+
+ where ``file_name.png`` is the name of the file to display. Depending on the system, users may need to install ImageMagick and/or adjust other settings (e.g., for X11 forwarding). Users should contact their machine administrator with any questions.
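+
+   For the secure-copy route, the command generally takes a form like the following, run on the local system (the remote host and path are hypothetical; consult the data transfer documentation for the correct host name):
+
+   .. code-block:: console
+
+      scp username@remote.host:/path/to/file_name.png .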
diff --git a/doc/source/BuildingRunningTesting/Container.rst b/doc/source/BuildingRunningTesting/Container.rst
index 31015eb5..9fb9a867 100644
--- a/doc/source/BuildingRunningTesting/Container.rst
+++ b/doc/source/BuildingRunningTesting/Container.rst
@@ -4,9 +4,9 @@
Containerized Land DA Workflow
**********************************
-These instructions will help users build and run a basic case for the Unified Forecast System (:term:`UFS`) Land Data Assimilation (DA) System using a `Singularity/Apptainer `_ container. The Land DA :term:`container` packages together the Land DA System with its dependencies (e.g., :term:`spack-stack`, :term:`JEDI`) and provides a uniform environment in which to build and run the Land DA System. Normally, the details of building and running Earth systems models will vary based on the computing platform because there are many possible combinations of operating systems, compilers, :term:`MPIs `, and package versions available. Installation via Singularity/Apptainer container reduces this variability and allows for a smoother experience building and running Land DA. This approach is recommended for users not running Land DA on a supported :ref:`Level 1 ` system (i.e., Hera, Orion).
+These instructions will help users build and run a basic case for the Unified Forecast System (:term:`UFS`) Land Data Assimilation (DA) System using a `Singularity/Apptainer `_ container. The Land DA :term:`container` packages together the Land DA System with its dependencies (e.g., :term:`spack-stack`, :term:`JEDI`) and provides a uniform environment in which to build and run the Land DA System. Normally, the details of building and running Earth system models will vary based on the computing platform because there are many possible combinations of operating systems, compilers, :term:`MPIs `, and package versions available. Installation via Singularity/Apptainer container reduces this variability and allows for a smoother experience building and running Land DA. This approach is recommended for users not running Land DA on a supported :ref:`Level 1 ` system (e.g., Hera, Orion).
-This chapter provides instructions for building and running basic Land DA case for the UFS Land DA System using a Jan. 3-4, 2000 00z sample case using :term:`GSWP3` data with the UFS Noah-MP land component in a container.
+This chapter provides instructions for building and running the UFS Land DA System sample case in a container. The sample case runs for Jan. 3-4, 2000 00z and uses :term:`GSWP3` data with the UFS Noah-MP land component and data atmosphere (:term:`DATM`) component.
.. attention::
@@ -19,22 +19,23 @@ Prerequisites
The containerized version of Land DA requires:
- * `Installation of Apptainer `_
- * At least 6 CPU cores
+ * `Installation of Apptainer `_ (or its predecessor, Singularity)
+ * At least 26 CPU cores (may be possible to run with 13, but this has not been tested)
* An **Intel** compiler and :term:`MPI` (available for `free here `_)
+ * The `Slurm `_ job scheduler
-Install Singularity/Apptainer
-===============================
+Install Apptainer
+==================
.. note::
As of November 2021, the Linux-supported version of Singularity has been `renamed `_ to *Apptainer*. Apptainer has maintained compatibility with Singularity, so ``singularity`` commands should work with either Singularity or Apptainer (see `compatibility details here `_.)
-To build and run Land DA using a Singularity/Apptainer container, first install the software according to the `Apptainer Installation Guide `_. This will include the installation of all dependencies.
+To build and run Land DA using an Apptainer container, first install the software according to the `Apptainer Installation Guide `_. This will include the installation of all dependencies.
.. attention::
- Docker containers can only be run with root privileges, and users generally do not have root privileges on :term:`HPCs `. However, a Singularity image may be built directly from a Docker image for use on the system.
+ Docker containers can only be run with root privileges, and users generally do not have root privileges on :term:`HPCs `. However, an Apptainer image may be built directly from a Docker image for use on the system.
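+
+   For reference, such a build generally takes the following form (a sketch; the image tag is illustrative, and the sections below describe the recommended image):
+
+   .. code-block:: console
+
+      singularity build ubuntu22.04-intel-landda.img docker://noaaepic/ubuntu22.04-intel21.10-landda:release-public-v2.0.0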
.. _DownloadContainer:
@@ -99,13 +100,11 @@ On many NOAA :term:`RDHPCS`, a container named ``ubuntu22.04-intel-landda-releas
+-----------------+--------------------------------------------------------+
| Machine | File location |
+=================+========================================================+
- | Derecho | /glade/work/epicufsrt/contrib/containers |
- +-----------------+--------------------------------------------------------+
| Gaea | /gpfs/f5/epic/world-shared/containers |
+-----------------+--------------------------------------------------------+
| Hera | /scratch1/NCEPDEV/nems/role.epic/containers |
+-----------------+--------------------------------------------------------+
- | Jet | /mnt/lfs4/HFIP/hfv3gfs/role.epic/containers |
+ | Jet | /mnt/lfs5/HFIP/hfv3gfs/role.epic/containers |
+-----------------+--------------------------------------------------------+
| NOAA Cloud | /contrib/EPIC/containers |
+-----------------+--------------------------------------------------------+
@@ -122,12 +121,12 @@ If users prefer, they may copy the container to their local working directory. F
.. code-block:: console
- cp /mnt/lfs4/HFIP/hfv3gfs/role.epic/containers/ubuntu22.04-intel-landda-release-public-v2.0.0.img .
+ cp /mnt/lfs5/HFIP/hfv3gfs/role.epic/containers/ubuntu22.04-intel-landda-release-public-v2.0.0.img .
Other Systems
----------------
-On other systems, users can build the Singularity container from a public Docker :term:`container` image or download the ``ubuntu22.04-intel-landda-release-public-v2.0.0.img`` container from the `Land DA Data Bucket `_. Downloading may be faster depending on the download speed on the user's system. However, the container in the data bucket is the ``release/v2.0.0`` container rather than the updated ``develop`` branch container.
+On other systems, users can build the Singularity container from a public Docker :term:`container` image or download the ``ubuntu22.04-intel-landda-release-public-v2.0.0.img`` container from the `Land DA Data Bucket `_. Downloading may be faster depending on the download speed on the user's system. However, the container in the data bucket is the ``release/public-v2.0.0`` container rather than an updated ``develop`` branch container.
To download from the data bucket, users can run:
@@ -221,10 +220,6 @@ The ``setup_container.sh`` script should now be in the ``$LANDDAROOT`` directory
where ```` and ```` are replaced with a top-level directory on the local system and in the container, respectively. Additional directories can be bound by adding another ``-B /:/`` argument before the container location (``$img``). Note that if previous steps included a ``sudo`` command, ``sudo`` may be required in front of this command.
-.. attention::
-
- Be sure to bind the directory that contains the experiment data!
-
.. note::
Sometimes binding directories with different names can cause problems. In general, it is recommended that the local base directory and the container directory have the same name. For example, if the host system's top-level directory is ``/user1234``, the user may want to convert the ``.img`` file to a writable sandbox and create a ``user1234`` directory in the sandbox to bind to.
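+
+   Following that recommendation, a bind mount with matching names might look like this sketch (shown with ``singularity shell``; the directory name and image variable are illustrative):
+
+   .. code-block:: console
+
+      singularity shell -B /user1234:/user1234 $img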
@@ -256,7 +251,7 @@ Because of a conda conflict between the container and the host system, it is bes
module load rocoto
-The ``setup_container.sh`` script creates the ``parm_xml.yaml`` from the ``parm_xml_singularity.yaml`` file. Update any relevant variables in this file (e.g. ``ACCOUNT`` or ``cycledef/spec``) before creating the Rocoto XML file.
+The ``setup_container.sh`` script creates the ``parm_xml.yaml`` from the ``parm_xml_singularity.yaml`` file. Update any relevant variables in this file (e.g., ``account`` or ``exp_basedir``) before creating the Rocoto XML file.
.. code-block:: console
@@ -265,14 +260,14 @@ The ``setup_container.sh`` script creates the ``parm_xml.yaml`` from the ``parm_
Save and close the file.
-Once everything looks good, run the uwtools scripts to create the Rocoto XML file:
+Once everything looks good, run the `uwtools `_ scripts to create the Rocoto XML file:
.. code-block:: console
../sorc/conda/envs/land_da/bin/uw template render --input-file templates/template.land_analysis.yaml --values-file parm_xml.yaml --output-file land_analysis.yaml
../sorc/conda/envs/land_da/bin/uw rocoto realize --input-file land_analysis.yaml --output-file land_analysis.xml
-A successful run of this command will output a “0 errors found” message.
+A successful run of these commands will output a “0 errors found” message.
.. _RunExptC:
@@ -285,7 +280,9 @@ To start the experiment, run:
rocotorun -w land_analysis.xml -d land_analysis.db
-See the :ref:`Workflow Overview ` section to learn more about the workflow process.
+Users will need to issue the ``rocotorun`` command multiple times. The tasks must be run in order, and ``rocotorun`` initiates the next task once its dependencies have completed successfully.
+
+See the :ref:`Workflow Overview ` section to learn more about the steps in the workflow process.
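+
+A typical pattern is to alternate the two Rocoto commands until every task reports SUCCEEDED (both are run from the directory containing the workflow XML and database files):
+
+.. code-block:: console
+
+   rocotorun -w land_analysis.xml -d land_analysis.db
+   rocotostat -w land_analysis.xml -d land_analysis.db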
.. _TrackProgress:
@@ -311,5 +308,5 @@ See the :ref:`Track Experiment Status ` section to learn more abo
Check Experiment Output
-------------------------
-Since this experiment in the container is the same experiment explained in the previous document section, it is suggested that users should see the :ref:`experiment output structure ` as well as the :ref:`plotting results ` to learn more about the expected experiment outputs.
+Since this experiment in the container is the same experiment explained in the previous document section, it is suggested that users view the :ref:`experiment output structure ` and :ref:`plotting results ` sections to learn more about the expected experiment output.
diff --git a/doc/source/BuildingRunningTesting/TestingLandDA.rst b/doc/source/BuildingRunningTesting/TestingLandDA.rst
index 4bb5d849..e31315a2 100644
--- a/doc/source/BuildingRunningTesting/TestingLandDA.rst
+++ b/doc/source/BuildingRunningTesting/TestingLandDA.rst
@@ -4,27 +4,30 @@
Testing the Land DA Workflow
************************************
-This chapter provides instructions for using the Land DA CTest suite. These steps are designed for use on :ref:`Level 1 ` systems (i.e., Hera and Orion) and may require significant changes on other systems.
+This chapter provides instructions for using the Land DA CTest suite. These steps are designed for use on :ref:`Level 1 ` systems (e.g., Hera and Orion) and may require significant changes on other systems.
.. attention::
- This chapter assumes that the user has already built the Land DA System according to the instructions in :numref:`Section %s ` and has access to the data provided in the most recent release. (See :numref:`Table %s ` for the locations of pre-staged data on NOAA :term:`RDHPCS` [i.e., Hera and Orion].)
+ This chapter assumes that the user has already built the Land DA System according to the instructions in :numref:`Section %s ` and has access to the data provided in the most recent release. (See :numref:`Table %s ` for the locations of pre-staged data on NOAA :term:`RDHPCS`.)
Process
*********
-From the working directory (``$LANDDAROOT``), navigate to ``build``. Then run:
+Method #1: Run from the ``build`` Directory
+============================================
+
+From the working directory (``$LANDDAROOT``), navigate to ``build`` and run:
.. code-block:: console
- salloc --ntasks 8 --exclusive --qos=debug --partition= --time=00:30:00 --account=
cd land-DA_workflow/sorc/build
+ salloc --ntasks 8 --exclusive --qos=debug --partition= --time=00:30:00 --account=
source ../../versions/build.ver_
module use ../../modulefiles
module load build__intel
ctest
-where ```` corresponds to the user's actual account name, ```` is a valid partition on the platform of choice (e.g., ``debug`` or ``orion``), and ```` is ``hera`` or ``orion``.
+where ```` corresponds to the user's actual account name, ```` is a valid partition on the platform of choice (e.g., ``debug`` or ``orion``), and ```` is ``hera``, ``orion``, or ``hercules``.
This will submit an interactive job, load the appropriate modulefiles, and run the CTests.
@@ -46,10 +49,46 @@ If the tests are successful, a message will be printed to the console. For examp
Total Test time (real) = 187.29 sec
+Method #2: Run from the ``test`` Directory
+============================================
+
+.. note::
+
+ This method works only on Hera, Orion, and Hercules and will run even if the Land DA System has not been built yet.
+
+From the working directory (``$LANDDAROOT``), navigate to ``test`` and run:
+
+.. code-block:: console
+
+ cd land-DA_workflow/sorc/test
+ ./run_ctest_platform.sh
+
+The CTest working directory will appear in ``build/test``, and the log file can be found in ``build/Testing/Temporary``.
+
+Method #3: Run Tests Using a Container
+============================================
+
+.. attention::
+
+ The container CTest functionality has been tested in Jenkins. It should be able to run on a sufficiently large cloud instance. However, it is considered unsupported functionality because it has not been thoroughly tested on the cloud for use by the public.
+
+For containers, the CTest functionality is wrapped in a Dockerfile. Therefore, users will need to build the Dockerfile to run the CTests. Since the Land DA container is quite large, this process can take a long time --- potentially hours. In the future, the development team hopes to simplify and shorten this process.
+
+.. code-block:: console
+
+ git clone -b release/public-v2.0.0 --recursive https://github.com/ufs-community/land-DA_workflow.git
+ cd land-DA_workflow/sorc/test/ci
+ sudo systemctl start docker
+ sudo docker build -f Dockerfile -t dockerfile-ci-ctest:release .
+
+.. note::
+
+ ``sudo`` may not be required in front of the last two commands on all systems.
+
Tests
*******
-The CTests test the operability of four major elements of the Land DA System: ``create_ens``, ``letkfoi_snowda``, ``apply_jediincr``, and ``ufs_datm_land``. The tests and their dependencies are listed in the ``land-DA_workflow/test/CMakeLists.txt`` file. Currently, the CTests are only run on Hera and Orion; they cannot yet be run via container.
+The CTests test the operability of four major elements of the Land DA System: ``create_ens``, ``letkfoi_snowda``, ``apply_jediincr``, and ``ufs_datm_land``. The tests and their dependencies are listed in the ``land-DA_workflow/test/CMakeLists.txt`` file.
.. list-table:: *Land DA CTests*
:widths: 20 50
@@ -58,10 +97,15 @@ The CTests test the operability of four major elements of the Land DA System: ``
* - Test
- Description
* - ``test_create_ens``
- - Tests creation of a pseudo-ensemble for use in LETKF-OI.
+ - Tests creation of a pseudo-ensemble for use in :term:`LETKF-OI`.
* - ``test_letkfoi_snowda``
- Tests the use of LETKF-OI to assimilate snow data.
* - ``test_apply_jediincr``
- Tests the ability to add a JEDI increment.
* - ``test_ufs_datm_land``
- Tests proper functioning of the UFS land model (``ufs-datm-lnd``).
+
+.. note::
+
+ There are plans to add workflow end-to-end (WE2E) tests to the Land DA System. Currently, when ``WE2E_TEST: "YES"``, this functionality checks that the output from the Jan. 3-4, 2000 sample case is within the tolerance (set via the ``WE2E_ATOL`` variable) at the end of the three main tasks --- *analysis*, *forecast*, and *post_anal*. Results are logged in ``we2e.log`` by default. In the future, this functionality will be expanded to encompass a full range of WE2E tests.
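+
+   A minimal sketch of the corresponding entities in ``template.land_analysis.yaml`` (values match the defaults listed in the workflow configuration chapter):
+
+   .. code-block:: yaml
+
+      WE2E_TEST: "YES"
+      WE2E_ATOL: "1e-7"
+      WE2E_LOG_FN: "we2e.log"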
+
diff --git a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst
index 8ce0284c..5440bac1 100644
--- a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst
+++ b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst
@@ -4,9 +4,9 @@
Available Workflow Configuration Parameters
***************************************************
-To run the Land DA System, users must create an experiment configuration file (named ``land_analysis.yaml`` by default). This file contains experiment-specific information, such as forecast/cycle dates, grid and physics suite choices, data directories, and other relevant settings. To help the user, two sample ``land_analysis_.yaml`` configuration files have been included in the ``parm`` directory for use on Hera and Orion. They contain reasonable experiment default values that work on those machines. The content of these files can be copied into ``land_analysis.yaml`` and used as the starting point from which to generate a variety of experiment configurations for Land DA.
+To run the Land DA System, users must create an experiment configuration file (named ``land_analysis.yaml`` by default) that combines the default values in ``template.land_analysis.yaml`` with user-specified values from ``parm_xml.yaml``. Currently, ``template.land_analysis.yaml`` contains most of the experiment-specific information, such as forecast/cycle dates, while ``parm_xml.yaml`` contains user/machine-specific settings, such as data directory locations. To help the user, sample ``parm_xml_<machine>.yaml`` configuration files have been included in the ``parm`` directory for use on Hera, Orion, and Hercules. The ``template.land_analysis.yaml`` file contains reasonable experiment default values that work on those machines. The contents of these files can be used as the starting point from which to generate a variety of experiment configurations for Land DA.
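+As a rough illustration, a filled-in ``parm_xml.yaml`` for Orion might contain entries along the following lines. The key names are inferred from the ``{{ }}`` template variables shown below, and the paths are taken from elsewhere in this guide; users should consult the sample ``parm_xml_<machine>.yaml`` files for the authoritative contents.
+
+.. code-block:: console
+
+ machine: "orion"
+ account: "epic"
+ exp_basedir: "/work/noaa/epic/{USER}/landda_test"
+ jedi_install: "/work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6"
+ warmstart_dir: "/work/noaa/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART"
+ we2e_test: "NO"
+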
-The following is a list of the parameters in the ``land_analysis_.yaml`` files. For each parameter, the default value and a brief description are provided.
+The following is a list of the parameters included in the ``land_analysis.yaml`` file (and derived from ``template.land_analysis.yaml`` and ``parm_xml.yaml``). For each parameter, the default value and a brief description are provided.
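+For reference, ``land_analysis.yaml`` can be regenerated from the template and the user-specified values with the ``uw`` tool (the same command appears in the FAQ):
+
+.. code-block:: console
+
+ uw template render --input-file templates/template.land_analysis.yaml --values-file parm_xml.yaml --output-file land_analysis.yaml
+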
.. _wf-attributes:
@@ -66,7 +66,7 @@ Cycling information is defined in the ``cycledef:`` section under ``workflow:``.
Workflow Entities
===================
-Entities are constants that can be referred to throughout the workflow using the ampersand (``&``) prefix and semicolon (``;``) suffix (e.g., ``&MACHINE;``) to avoid defining the same constants repetitively in each workflow task. For example, in ``land_analysis_orion.yaml``, the following entities are defined:
+Entities are constants that can be referred to throughout the workflow using the ampersand (``&``) prefix and semicolon (``;``) suffix (e.g., ``&MACHINE;``) to avoid defining the same constants repetitively in each workflow task. For example, in a ``land_analysis.yaml`` created on Orion, the following entities are defined:
.. code-block:: console
@@ -80,18 +80,29 @@ Entities are constants that can be referred to throughout the workflow using the
WARMSTART_DIR: "/work/noaa/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART"
ATMOS_FORC: "gswp3"
RES: "96"
- FCSTHR: "24"
NPROCS_ANALYSIS: "6"
- NPROCS_FORECAST: "7"
+ FCSTHR: "24"
+ DT_ATMOS: "900"
+ DT_RUNSEQ: "3600"
+ NPROCS_FORECAST: "26"
+ NPROCS_FORECAST_ATM: "12"
+ NPROCS_FORECAST_LND: "12"
+ LND_LAYOUT_X: "1"
+ LND_LAYOUT_Y: "2"
+ LND_OUTPUT_FREQ_SEC: "21600"
+ NNODES_FORECAST: "1"
+ NPROCS_PER_NODE: "26"
OBSDIR: ""
OBSDIR_SUBDIR: ""
OBS_TYPES: "GHCN"
DAtype: "letkfoi_snow"
- SNOWDEPTHVAR: "snwdph"
TSTUB: "oro_C96.mx100"
+ WE2E_TEST: "YES"
+ WE2E_ATOL: "1e-7"
+ WE2E_LOG_FN: "we2e.log"
NET: "landda"
envir: "test"
- model_ver: "v1.2.1"
+ model_ver: "v2.0.0"
RUN: "landda"
HOMElandda: "&EXP_BASEDIR;/land-DA_workflow"
PTMP: "&EXP_BASEDIR;/ptmp"
@@ -107,27 +118,23 @@ Entities are constants that can be referred to throughout the workflow using the
DATADEP_FILE3: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc"
DATADEP_FILE4: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc"
-.. note::
-
- When two defaults are listed, one is the default on Hera, and one is the default on Orion, depending on the ``land_analysis_.yaml`` file used. The default on Hera is listed first, followed by the default on Orion.
-
-``MACHINE:`` (Default: "hera" or "orion")
- The machine (a.k.a. platform or system) on which the workflow will run. Currently supported platforms are listed in :numref:`Section %s `. Valid values: ``"hera"`` | ``"orion"``
+``MACHINE:`` (Default: "{{ machine }}")
+ The machine (a.k.a. platform or system) on which the workflow will run. The actual value is derived from the ``parm_xml_<machine>.yaml`` file. Currently supported platforms are listed in :numref:`Section %s `. Valid values: ``"hera"`` | ``"orion"`` | ``"hercules"``
``SCHED:`` (Default: "slurm")
The job scheduler to use (e.g., Slurm) on the specified ``MACHINE``. Valid values: ``"slurm"``. Other options may work with a container but have not been tested: ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"``
-``ACCOUNT:`` (Default: "epic")
- An account where users can charge their compute resources on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field on a system with a Slurm job scheduler, users may run the ``saccount_params`` command to display account details. On other systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names.
+``ACCOUNT:`` (Default: "{{ account }}")
+ An account where users can charge their compute resources on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field on a system with a Slurm job scheduler, users may run the ``saccount_params`` command to display account details. On other systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. The actual value used in the workflow is derived from the ``parm_xml_<machine>.yaml`` file.
-``EXP_BASEDIR:`` (Default: "/scratch2/NAGAPE/epic/{USER}/landda_test" or "/work/noaa/epic/{USER}/landda_test")
- The full path to the parent directory of ``land-DA_workflow`` (i.e., ``$LANDDAROOT`` in the documentation).
+``EXP_BASEDIR:`` (Default: "{{ exp_basedir }}")
+ The full path to the parent directory of ``land-DA_workflow`` (i.e., ``$LANDDAROOT`` in the documentation). The actual value is derived from the ``parm_xml_<machine>.yaml`` file.
-``JEDI_INSTALL:`` (Default: "/scratch2/NAGAPE/epic/UFS_Land-DA_Dev/jedi_v7" or "/work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6")
- The path to the JEDI |skylabv| installation.
+``JEDI_INSTALL:`` (Default: "{{ jedi_install }}")
+ The path to the JEDI |skylabv| installation. The actual value is derived from the ``parm_xml_<machine>.yaml`` file.
-``WARMSTART_DIR:`` (Default: "/scratch2/NAGAPE/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART" or "/work/noaa/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART")
- The path to restart files for a warmstart experiment.
+``WARMSTART_DIR:`` (Default: "{{ warmstart_dir }}")
+ The path to restart files for a warmstart experiment. The actual value is derived from the ``parm_xml_<machine>.yaml`` file.
``ATMOS_FORC:`` (Default: "gswp3")
Type of atmospheric forcing data used. Valid values: ``"gswp3"``
@@ -141,11 +148,38 @@ Entities are constants that can be referred to throughout the workflow using the
``NPROCS_ANALYSIS:`` (Default: "6")
Number of processors for the analysis task.
-``NPROCS_FORECAST:`` (Default: "7")
- Number of processors for the forecast task.
+``DT_ATMOS:`` (Default: "900")
+ The main integration time step of the atmospheric component of the UFS Weather Model (in seconds). This is the time step for the outermost atmospheric model loop and must be a positive integer value. It corresponds to the frequency at which the physics routines and the top-level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, tracer transport, and vertical dynamics routines; see the `FV3 dycore scientific documentation `_ for details.)
+
+``DT_RUNSEQ:`` (Default: "3600")
+ Time interval of the run sequence (coupling interval) between the model components of the UFS Weather Model (in seconds). With the default values shown above, the components exchange data every 3600/900 = 4 atmospheric time steps.
+
+``NPROCS_FORECAST:`` (Default: "26")
+ Total number of processes for the FORECAST task.
+
+``NPROCS_FORECAST_ATM:`` (Default: "12")
+ Number of processes for the atmospheric model component (DATM) in the FORECAST task.
+
+``NPROCS_FORECAST_LND:`` (Default: "12")
+ Number of processes for the land model component (Noah-MP) in the FORECAST task.
+
+``LND_LAYOUT_X:`` (Default: "1")
+ Number of processes in the x direction per tile for the land model component.
+
+``LND_LAYOUT_Y:`` (Default: "2")
+ Number of processes in the y direction per tile for the land model component. With the default 1x2 layout on the six cubed-sphere tiles, the land component uses 6 x 1 x 2 = 12 processes, matching the default ``NPROCS_FORECAST_LND``.
+
+``LND_OUTPUT_FREQ_SEC:`` (Default: "21600")
+ Output frequency of the land model component (in seconds).
+
+``NNODES_FORECAST:`` (Default: "1")
+ Number of nodes for the FORECAST task.
+
+``NPROCS_PER_NODE:`` (Default: "26")
+ Number of processes per node for the FORECAST task.
+
``OBSDIR:`` (Default: "")
- The path to the directory where DA fix files are located. In ``scripts/exlandda_prep_obs.sh``, this value is set to ``${FIXlandda}/DA`` unless the user specifies a different path in ``land_analysis.yaml``.
+ The path to the directory where DA fix files are located. In ``scripts/exlandda_prep_obs.sh``, this value is set to ``${FIXlandda}/DA`` unless the user specifies a different path in ``template.land_analysis.yaml``.
``OBSDIR_SUBDIR:`` (Default: "")
The path to the directories where different types of fix data (e.g., ERA5, GSWP3, GTS, NOAH-MP) are located. In ``scripts/exlandda_prep_obs.sh``, this value is set based on the type(s) of data requested. The user may choose to set a different value.
@@ -154,14 +188,20 @@ Entities are constants that can be referred to throughout the workflow using the
Specifies the observation type. Format is "Obs1" "Obs2". Currently, only GHCN observation data is available.
``DAtype:`` (Default: "letkfoi_snow")
- Type of data assimilation. Valid values: ``letkfoi_snow``. Currently, Land DA only performs snow DA using the LETKF-OI algorithm. As the application expands, more options may be added.
-
-``SNOWDEPTHVAR:`` (Default: "snwdph")
- Placeholder --- currently not used in workflow. This value is currently hard-coded into ``scripts/exlandda_analysis.sh``.
+ Type of data assimilation. Valid values: ``letkfoi_snow``. Currently, Land DA only performs snow DA. As the application expands, more options may be added.
``TSTUB:`` (Default: "oro_C96.mx100")
Specifies the file stub/name for orography files in ``TPATH``. This file stub is named ``oro_C${RES}`` for atmosphere-only orography files and ``oro_C${RES}.mx100`` for atmosphere and ocean orography files. When Land DA is compiled with ``sorc/app_build.sh``, the subdirectories of the fix files should be linked into the ``fix`` directory, and orography files can be found in ``fix/FV3_fix_tiled/C96``.
+``WE2E_TEST:`` (Default: "{{ we2e_test }}"/"NO")
+ Flag to turn on the workflow end-to-end (WE2E) test. When ``WE2E_TEST: "YES"``, the result files from the experiment are compared to the test baseline files, located in ``fix/test_base/we2e_com``. If the results are within the tolerance set (via ``WE2E_ATOL``) at the end of the three main tasks (``analysis``, ``forecast``, and ``post_anal``), then the experiment passes. The actual value is derived from the ``parm_xml_<machine>.yaml`` file but is preset to "NO" in that file. Valid values: ``"YES"`` | ``"NO"``
+
+``WE2E_ATOL:`` (Default: "1e-7")
+ Tolerance of the WE2E test.
+
+``WE2E_LOG_FN:`` (Default: "we2e.log")
+ Name of the WE2E test log file.
+
``DATADEP_FILE1:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc")
File name for the dependency check for the task ``pre_anal``. The ``pre_anal`` task is triggered only when one or more of the ``DATADEP_FILE#`` files exists. Otherwise, the task will not be submitted.
@@ -196,7 +236,7 @@ Standard environment variables are defined in the NCEP Central Operations :nco:`
``NET:`` (Default: "landda")
Model name (first level of ``com`` directory structure)
-``model_ver:`` (Default: "v1.2.1")
+``model_ver:`` (Default: "v2.0.0")
Version number of package in three digits (e.g., v#.#.#); second level of ``com`` directory
``RUN:`` (Default: "landda")
@@ -262,9 +302,9 @@ The following subsections explain any variables that have not already been expla
Sample Task: Analysis Task (``task_analysis``)
------------------------------------------------
-This section walks users through the structure of the analysis task (``task_analysis``) to explain how configuration information is provided in the ``land_analysis_.yaml`` file for each task. Since each task has a similar structure, common information is explained in this section. Variables unique to a particular task are defined in their respective ``task_`` sections below.
+This section walks users through the structure of the analysis task (``task_analysis``) to explain how configuration information is provided to the ``land_analysis.yaml`` file for each task. Since each task has a similar structure, common information is explained in this section. Variables unique to a particular task are defined in their respective ``task_<name>`` sections below.
-Parameters for a particular task are set in the ``workflow.tasks.task_:`` section of the ``land_analysis_.yaml`` file. For example, settings for the analysis task are provided in the ``task_analysis:`` section of ``land_analysis_.yaml``. The following is an excerpt of the ``task_analysis:`` section of ``land_analysis_.yaml``:
+Parameters for a particular task are set in the ``workflow.tasks.task_<name>:`` section of the ``template.land_analysis.yaml`` file. For example, settings for the analysis task are provided in its ``task_analysis:`` section. The following is an excerpt of that section:
.. code-block:: console
@@ -282,6 +322,10 @@ Parameters for a particular task are set in the ``workflow.tasks.task_:``
EXP_NAME: "&EXP_NAME;"
RES: "&RES;"
TSTUB: "&TSTUB;"
+ WE2E_TEST: "&WE2E_TEST;"
+ WE2E_ATOL: "&WE2E_ATOL;"
+ WE2E_LOG_FN: "&WE2E_LOG_FN;"
+ LOGDIR: "&LOGDIR;"
model_ver: "&model_ver;"
HOMElandda: "&HOMElandda;"
COMROOT: "&COMROOT;"
@@ -290,7 +334,6 @@ Parameters for a particular task are set in the ``workflow.tasks.task_:``
PDY: "&PDY;"
cyc: "&cyc;"
DAtype: "&DAtype;"
- SNOWDEPTHVAR: "&SNOWDEPTHVAR;"
NPROCS_ANALYSIS: "&NPROCS_ANALYSIS;"
JEDI_INSTALL: "&JEDI_INSTALL;"
account: "&ACCOUNT;"
@@ -315,8 +358,6 @@ The ``attrs:`` section for each task includes the ``cycledefs:`` attribute and t
``cycledefs:`` (Default: cycled)
A comma-separated list of ``cycledef:`` group names. A task with a ``cycledefs:`` group ID will be run only if its group ID matches one of the workflow's ``cycledef:`` group IDs.
-.. COMMENT: Clarify!
-
``maxtries:`` (Default: 2)
The maximum number of times Rocoto can resubmit a failed task.
@@ -444,14 +485,14 @@ Other tasks may list data or time dependencies. For example, the pre-analysis ta
age: 5
value: "&DATADEP_FILE4;"
-For details on the dependency details (e.g., ``attrs:``, ``age:``, ``value:`` tags), view the authoritative :rocoto:`Rocoto documentation <>`.
+For details on dependencies (e.g., ``attrs:``, ``age:``, ``value:`` tags), view the authoritative :rocoto:`Rocoto documentation <>`.
.. _prep-obs:
Observation Preparation Task (``task_prep_obs``)
--------------------------------------------------
-Parameters for the observation preparation task are set in the ``task_prep_obs:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
+Parameters for the observation preparation task are set in the ``task_prep_obs:`` section of the ``template.land_analysis.yaml`` file. Most task variables are the same as the defaults set in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
.. code-block:: console
@@ -489,7 +530,7 @@ Parameters for the observation preparation task are set in the ``task_prep_obs:`
Pre-Analysis Task (``task_pre_anal``)
---------------------------------------
-Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
+Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section of the ``template.land_analysis.yaml`` file. Most task variables are the same as the defaults set in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
.. code-block:: console
@@ -546,14 +587,14 @@ Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section o
Analysis Task (``task_analysis``)
-----------------------------------
-Parameters for the analysis task are set in the ``task_analysis:`` section of the ``land_analysis_.yaml`` file. Most are the same as the defaults set in the :ref:`Workflow Entities ` section. The ``task_analysis:`` task is explained fully in the :ref:`Sample Task ` section.
+Parameters for the analysis task are set in the ``task_analysis:`` section of the ``template.land_analysis.yaml`` file. Most are the same as the defaults set in the :ref:`Workflow Entities ` section. The ``task_analysis:`` task is explained fully in the :ref:`Sample Task ` section.
.. _post-analysis:
Post-Analysis Task (``task_post_anal``)
-----------------------------------------
-Parameters for the post analysis task are set in the ``task_post_anal:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
+Parameters for the post-analysis task are set in the ``task_post_anal:`` section of the ``template.land_analysis.yaml`` file. Most task variables are the same as the defaults set in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
.. code-block:: console
@@ -595,7 +636,7 @@ Parameters for the post analysis task are set in the ``task_post_anal:`` section
Plotting Task (``task_plot_stats``)
-------------------------------------
-Parameters for the plotting task are set in the ``task_plot_stats:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
+Parameters for the plotting task are set in the ``task_plot_stats:`` section of the ``template.land_analysis.yaml`` file. Most task variables are the same as the defaults set in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
.. code-block:: console
@@ -634,7 +675,7 @@ Parameters for the plotting task are set in the ``task_plot_stats:`` section of
Forecast Task (``task_forecast``)
----------------------------------
-Parameters for the forecast task are set in the ``task_forecast:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
+Parameters for the forecast task are set in the ``task_forecast:`` section of the ``template.land_analysis.yaml`` file. Most task variables are the same as the defaults set in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
.. code-block:: console
@@ -662,16 +703,24 @@ Parameters for the forecast task are set in the ``task_forecast:`` section of th
cyc: "&cyc;"
DAtype: "&DAtype;"
FCSTHR: "&FCSTHR;"
- NPROCS_FORECAST: "&NPROCS_FORECAST;"
+ DT_ATMOS: "&DT_ATMOS;"
+ DT_RUNSEQ: "&DT_RUNSEQ;"
+ NPROCS_FORECAST: "&NPROCS_FORECAST;"
+ NPROCS_FORECAST_ATM: "&NPROCS_FORECAST_ATM;"
+ NPROCS_FORECAST_LND: "&NPROCS_FORECAST_LND;"
+ LND_LAYOUT_X: "&LND_LAYOUT_X;"
+ LND_LAYOUT_Y: "&LND_LAYOUT_Y;"
+ LND_OUTPUT_FREQ_SEC: "&LND_OUTPUT_FREQ_SEC;"
+ NNODES_FORECAST: "&NNODES_FORECAST;"
+ NPROCS_PER_NODE: "&NPROCS_PER_NODE;"
account: "&ACCOUNT;"
command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "forecast" "&HOMElandda;" "&MACHINE;"'
jobname: forecast
- nodes: "1:ppn=&NPROCS_FORECAST;"
- walltime: 01:00:00
+ nodes: "&NNODES_FORECAST;:ppn=&NPROCS_PER_NODE;"
+ walltime: 00:30:00
queue: batch
join: "&LOGDIR;/forecast&LOGFN_SUFFIX;"
dependency:
taskdep:
attrs:
task: post_anal
-
diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst
index fa47c5f8..c15c8c10 100644
--- a/doc/source/CustomizingTheWorkflow/DASystem.rst
+++ b/doc/source/CustomizingTheWorkflow/DASystem.rst
@@ -4,7 +4,7 @@
Input/Output Files for the JEDI DA System
******************************************
-This chapter describes the configuration of the offline Land :term:`Data Assimilation` (DA) System, which utilizes the UFS Noah-MP component together with the ``jedi-bundle`` (|skylabv|) to enable cycled model forecasts. The data assimilation framework applies the Local Ensemble Transform Kalman Filter-Optimal Interpolation (LETKF-OI) algorithm to combine the state-dependent background error derived from an ensemble forecast with the observations and their corresponding uncertainties to produce an analysis ensemble (:cite:t:`HuntEtAl2007`, 2007).
+This chapter describes the configuration of the offline Land :term:`Data Assimilation` (DA) System, which utilizes the UFS Noah-MP component together with the ``jedi-bundle`` (|skylabv|) to enable cycled model forecasts. The data assimilation framework applies the Local Ensemble Transform Kalman Filter (LETKF) algorithm with a pseudo-ensemble error covariance.
Joint Effort for Data Assimilation Integration (JEDI)
********************************************************
@@ -35,7 +35,7 @@ JEDI Configuration Files & Parameters
The DA experiment integrates information from several YAML configuration files, which contain certain fundamental components such as geometry, time window, background, driver, local ensemble DA, output increment, and observations. These components can be implemented differently for different models and observation types, so they frequently contain distinct parameters and variable names depending on the use case. Therefore, this section of the User's Guide focuses on assisting users with understanding and customizing these top-level configuration items in order to run Land DA experiments. Users may also reference the :jedi:`JEDI Documentation ` for additional information.
-In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information on geometry, time window, background, driver, local ensemble DA, and output increment, while ``GHCN.yaml`` contains detailed information to configure observations. In the ``develop`` branch, :github:`these files ` reside in the ``land-DA_workflow/parm/jedi`` directory. Some of the variables in these files are templated, so they bring in information from other files, such as the workflow configuration file (``land_analysis.yaml``) and the actual netCDF observation file (e.g., ``ghcn_snwd_ioda_20000103.nc``). In the ``analysis`` task, this information is assembled into one ``letkf_land.yaml`` file that is used to perform the snow data assimilation. This file resides in the ``ptmp/test/tmp/analysis.${PDY}${cyc}.${jobid}/`` directory, where ``${PDY}${cyc}`` is in YYYYMMDDHH format (see :numref:`Section %s ` for more on these variables), and the ``${jobid}`` is the job ID assigned by the system. The example below shows what the complete ``letkf_land.yaml`` file might look like for the 2000-01-03 00Z cycle. The following subsections explain the variables used within this YAML file.
+In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information on geometry, time window, background, driver, local ensemble DA, and output increment, while ``GHCN.yaml`` contains detailed information to configure observations. In the ``develop`` branch, :github:`these files ` reside in the ``land-DA_workflow/parm/jedi`` directory. Some of the variables in these files are templated, so they bring in information from other files, such as the workflow configuration files (``parm_xml.yaml`` and ``template.land_analysis.yaml``) and the actual netCDF observation file (e.g., ``ghcn_snwd_ioda_20000103.nc``). In the ``analysis`` task, this information is assembled into one ``letkf_land.yaml`` file that is used to perform the snow data assimilation. This file resides in the ``ptmp/test/tmp/analysis.${PDY}${cyc}.${jobid}/`` directory, where ``${PDY}${cyc}`` is in YYYYMMDDHH format (see :numref:`Section %s ` for more on these variables), and the ``${jobid}`` is the job ID assigned by the system. The example below shows what the complete ``letkf_land.yaml`` file might look like for the 2000-01-03 00Z cycle. The following subsections explain the variables used within this YAML file.
.. code-block:: yaml
@@ -155,7 +155,7 @@ In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information
.. note::
- Any default values indicated in the sections below are the defaults set in ``letkfoi_snow.yaml``, ``GHCN.yaml``, or ``land_analysis.yaml``.
+ Any default values indicated in the sections below are the defaults set in ``letkfoi_snow.yaml``, ``GHCN.yaml``, ``parm_xml.yaml``, or ``template.land_analysis.yaml``.
Geometry
^^^^^^^^^^^
@@ -568,17 +568,17 @@ The grid description files appear in :numref:`Table %s ` below:
Observation Data
====================
-Observation data from 2000 and 2019 are provided in NetCDF format for the |latestr| release. Instructions for downloading the data are provided in :numref:`Section %s `, and instructions for accessing the data on :ref:`Level 1 Systems ` are provided in :numref:`Section %s `. Currently, data is taken from the `Global Historical Climatology Network `_ (GHCN), but eventually, data from the U.S. National Ice Center (USNIC) Interactive Multisensor Snow and Ice Mapping System (`IMS `_) will also be available for use.
+Observation data from 2000 are provided in NetCDF format for the |latestr| release. Instructions for downloading the data are provided in :numref:`Section %s `, and instructions for accessing the data on :ref:`Level 1 Systems ` are provided in :numref:`Section %s `. Currently, data is taken from the `Global Historical Climatology Network `_ (GHCN), but eventually, data from the U.S. National Ice Center (USNIC) Interactive Multisensor Snow and Ice Mapping System (`IMS `_) will also be available for use.
Users can view file header information and notes for NetCDF formatted files using the instructions in :numref:`Section %s `. For example, on Orion, users can run:
.. code-block:: console
# Load modules:
- module load netcdf-c/4.9.2
- ncdump -h /work/noaa/epic/UFS_Land-DA_Dev/inputs/DA/snow_depth/GHCN/data_proc/v3/2019/ghcn_snwd_ioda_20191221.nc
+ module load netcdf/4.7.0
+ ncdump -h /work/noaa/epic/UFS_Land-DA_Dev/inputs/DA/snow_depth/GHCN/data_proc/v3/2000/ghcn_snwd_ioda_20000103.nc
-to see the header contents of the 2019-12-21 GHCN snow depth file. Users may need to modify the module load command and the file path to reflect module versions/file paths that are available on their system.
+to see the header contents of the 2000-01-03 GHCN snow depth file. Users may need to modify the module load command and the file path to reflect module versions/file paths that are available on their system.
Observation Types
--------------------
@@ -600,19 +600,20 @@ The IODA-formatted GHCN files are available in the ``inputs/DA/snow_depth/GHCN/d
.. code-block:: console
- netcdf ghcn_snwd_ioda_20191221 {
+
+ netcdf ghcn_snwd_ioda_20000103 {
dimensions:
- Location = UNLIMITED ; // (10466 currently) ;
+ Location = UNLIMITED ; // (10423 currently)
variables:
int64 Location(Location) ;
Location:suggested_chunk_dim = 10000LL ;
// global attributes:
- string :_ioda_layout = "ObsGroup" ;
- :_ioda_layout_version = 0 ;
- string :converter = "ghcn_snod2ioda.py" ;
- string :date_time_string = "2000-01-01T18:00:00Z" ;
- :nlocs = 10466 ;
+ string :_ioda_layout = "ObsGroup" ;
+ :_ioda_layout_version = 0 ;
+ string :converter = "ghcn_snod2ioda.py" ;
+ string :date_time_string = "2000-01-03T18:00:00Z" ;
+ :nlocs = 10423 ;
group: MetaData {
variables:
@@ -639,7 +640,7 @@ The IODA-formatted GHCN files are available in the ``inputs/DA/snow_depth/GHCN/d
string totalSnowDepth:coordinates = "longitude latitude" ;
string totalSnowDepth:units = "mm" ;
} // group ObsError
-
+
group: ObsValue {
variables:
float totalSnowDepth(Location) ;
@@ -664,9 +665,9 @@ Observation Location and Processing
GHCN
^^^^^^
-GHCN files for 2000 and 2019 are already provided in IODA format for the |latestr| release. :numref:`Table %s ` indicates where users can find data on NOAA :term:`RDHPCS` platforms. Tar files containing the 2000 and 2019 data are located in the publicly-available `Land DA Data Bucket `_. Once untarred, the snow depth files are located in ``/inputs/DA/snow_depth/GHCN/data_proc/${YEAR}``. The 2019 GHCN IODA files were provided by Clara Draper (NOAA PSL). Each file follows the naming convention of ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc``, where ``${YYYY}`` is the four-digit cycle year, ``${MM}`` is the two-digit cycle month, and ``${DD}`` is the two-digit cycle day.
+GHCN files for 2000 and 2019 are already provided in IODA format for the |latestr| release. :numref:`Table %s ` indicates where users can find data on NOAA :term:`RDHPCS` platforms. Tar files containing the 2000 and 2019 data are located in the publicly-available `Land DA Data Bucket `_. Once untarred, the snow depth files are located in ``/inputs/DA/snow_depth/GHCN/data_proc/v3/${YEAR}``. The 2019 GHCN IODA files were provided by Clara Draper (NOAA PSL). Each file follows the naming convention of ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc``, where ``${YYYY}`` is the four-digit cycle year, ``${MM}`` is the two-digit cycle month, and ``${DD}`` is the two-digit cycle day.
-In each experiment, the ``land_analysis_*.yaml`` file sets the type of observation file (e.g., ``OBS_TYPES: "GHCN"``). Before assimilation, if "GHCN" was specified as the observation type, the ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc`` file corresponding to the specified cycle date is copied to the run directory (usually ``$LANDDAROOT/ptmp/test/com/landda/$model_ver/landda.$PDY$cyc/obs`` by default --- see :numref:`Section %s ` for more on these variables) with a naming-convention change (i.e., ``GHCN_${YYYY}${MM}${DD}${HH}.nc``).
+In each experiment, the ``template.land_analysis.yaml`` file sets the type of observation file (e.g., ``OBS_TYPES: "GHCN"``). Before assimilation, if "GHCN" was specified as the observation type, the ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc`` file corresponding to the specified cycle date is copied to the run directory (usually ``$LANDDAROOT/ptmp/test/com/landda/$model_ver/landda.$PDY$cyc/obs`` by default --- see :numref:`Section %s ` for more on these variables) with a naming-convention change (i.e., ``GHCN_${YYYY}${MM}${DD}${HH}.nc``).
Prior to ingesting the GHCN IODA files via the LETKF at the DA analysis time, the configuration settings are combined into a single ``letkf_land.yaml`` file, which is a concatenation of ``letkfoi_snow.yaml`` and ``GHCN.yaml`` (see :numref:`Section %s ` for further explanation). The GHCN-specific observation filters, domain checks, and quality control parameters from ``GHCN.yaml`` ensure that only snow depth observations that meet specific criteria are assimilated (the rest are rejected). The contents of ``GHCN.yaml`` are available :github:`on GitHub `.
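+As an illustration only, UFO observation filters are expressed in YAML; a hypothetical bounds check on snow depth (not necessarily the exact filter used in ``GHCN.yaml``) could look like:
+
+.. code-block:: yaml
+
+ - filter: Bounds Check
+   filter variables:
+   - name: totalSnowDepth
+   minvalue: 0.0
+   maxvalue: 2000.0
+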
@@ -846,13 +847,3 @@ To restart the Land DA System successfully after land model execution, all param
| snow_level_liquid | liquid content of snow levels | "mm" |
+--------------------------+-----------------------------------+-----------------------+
-The restart files also include one text file, ``${FILEDATE}.coupler.res``, which contains metadata for the restart.
-
-Example of ``${FILEDATE}.coupler.res``:
-
-.. code-block:: console
-
- 2 (Calendar: no_calendar=0, thirty_day_months=1, julian=2, gregorian=3, noleap=4)
- 2019 12 22 0 0 0 Model start time: year, month, day, hour, minute, second
- 2019 12 22 0 0 0 Current model time: year, month, day, hour, minute, second
-
diff --git a/doc/source/CustomizingTheWorkflow/Model.rst b/doc/source/CustomizingTheWorkflow/Model.rst
index 4d27ad01..18fd7988 100644
--- a/doc/source/CustomizingTheWorkflow/Model.rst
+++ b/doc/source/CustomizingTheWorkflow/Model.rst
@@ -4,7 +4,7 @@
Input/Output Files for the Noah-MP Model
*****************************************
-This chapter provides practical information on input files and parameters for the Noah-MP Land Surface Model (LSM) and its Vector-to-Tile Converter component.
+This chapter provides practical information on input files and parameters for the Noah-MP Land Surface Model (LSM).
For background information on the Noah-MP LSM, see :numref:`Section %s ` of the Introduction.
.. _InputFiles:
@@ -13,9 +13,9 @@ Input Files
**************
The UFS land model requires multiple input files to run, including static datasets (fix files containing climatological information, terrain, and land use data), initial conditions files, and forcing files.
-Users may reference the `Community Noah-MP Land Surface Modeling System Technical Description Version 5.0 `_ (2023) and the `Community Noah-MP User's Guide `_ (2011) for a detailed technical description of certain elements of the Noah-MP model.
+Users may reference the `Community Noah-MP Land Surface Modeling System Technical Description Version 5.0 `_ (2023) for a detailed technical description of certain elements of the Noah-MP model.
-In both the land component and land driver implementations of Noah-MP, static file(s) and initial conditions file(s) specify model parameters.
+In Noah-MP, the static file(s) and initial conditions file(s) specify model parameters.
These files are publicly available in the `Land DA data bucket `_.
Users can download the data and untar the file via the command line:
@@ -23,15 +23,15 @@ Users can download the data and untar the file via the command line:
.. code-block:: console
- wget https://noaa-ufs-land-da-pds.s3.amazonaws.com/develop-20240501/Landda_develop_data.tar.gz
- tar xvfz Landda_develop_data.tar.gz
+ wget https://noaa-ufs-land-da-pds.s3.amazonaws.com/develop-20241024/inputs.tar.gz
+ tar xvfz inputs.tar.gz
For data specific to the latest release (|latestr|), users can run:
.. code-block:: console
- wget https://noaa-ufs-land-da-pds.s3.amazonaws.com/current_land_da_release_data/v1.2.0/Landdav1.2.0_input_data.tar.gz
- tar xvfz Landdav1.2.0_input_data.tar.gz
+ wget https://noaa-ufs-land-da-pds.s3.amazonaws.com/current_land_da_release_data/v2.0.0/LandDAInputDatav2.0.0.tar.gz
+ tar xvfz LandDAInputDatav2.0.0.tar.gz
These files and their parameters are described in the following subsections.
@@ -43,11 +43,13 @@ Viewing netCDF Files
Users can view file information, variables, and notes for NetCDF files using the ``ncdump`` module. On Level 1 platforms, users can load the Land DA environment from ``land-DA_workflow`` as described in :numref:`Section %s `.
+.. include:: ../doc-snippets/load-env.rst
+
Then, users can run ``ncdump -h path/to/filename.nc``, where ``path/to/filename.nc`` is replaced with the path to the file. For example, on Orion, users might run:
.. code-block:: console
- module load netcdf-c/4.9.2
+ module load netcdf/4.7.0
ncdump -h /work/noaa/epic/UFS_Land-DA_Dev/inputs/NOAHMP_IC/ufs-land_C96_init_fields.tile1.nc
@@ -55,7 +57,7 @@ On other systems, users can load a compiler, MPI, and NetCDF modules before runn
.. code-block:: console
- module load intel/2022.1.2 impi/2022.1.2 netcdf-c/4.9.2
+ module load intel/2022.1.2 impi/2022.1.2 netcdf/4.7.0
ncdump -h /path/to/inputs/NOAHMP_IC/ufs-land_C96_init_fields.tile1.nc
Users may need to modify the ``module load`` command to reflect modules that are available on their system.
@@ -65,14 +67,10 @@ Users may need to modify the ``module load`` command to reflect modules that are
Input Files for the ``DATM`` + ``LND`` Configuration with GSWP3 data
======================================================================
-With the integration of the UFS Noah-MP land component into the Land DA System in the v1.2.0 release, model forcing options have been enhanced so that users can run the UFS land component (:term:`LND`) with the data atmosphere component (:term:`DATM`). Updates provide a new analysis option on the cubed-sphere native grid using :term:`GSWP3` forcing data to run a cycled experiment for 2000-01-03 to 2000-01-04. An artificial GHCN snow depth observation is provided for data assimilation (see :numref:`Section %s ` for more on GHCN files). The GHCN observations will be extended in the near future.
+With the integration of the UFS Noah-MP land component into the Land DA System, model forcing options have been enhanced so that users can run the UFS land component (:term:`LND`) with the data atmosphere component (:term:`DATM`). Updates provide a new analysis option on the cubed-sphere native grid using :term:`GSWP3` forcing data to run a cycled experiment for 2000-01-03 to 2000-01-04. An artificial GHCN snow depth observation is provided for data assimilation (see :numref:`Section %s ` for more on GHCN files). The GHCN observations will be extended in the near future.
On Level 1 platforms, the requisite data are pre-staged at the locations listed in :numref:`Section %s `. The data are also publicly available via the `Land DA Data Bucket `_.
-.. attention::
-
- The DATM + LND option is only supported on Level 1 systems (i.e., Hera and Orion). It is not tested or supported using a container except on Hera and Orion.
-
Forcing Files
---------------
diff --git a/doc/source/Reference/FAQ.rst b/doc/source/Reference/FAQ.rst
index 3af435b2..53ed2898 100644
--- a/doc/source/Reference/FAQ.rst
+++ b/doc/source/Reference/FAQ.rst
@@ -8,6 +8,24 @@ Frequently Asked Questions (FAQ)
:depth: 2
:local:
+.. _DeadTask:
+
+My tasks went DEAD. Why might this be?
+========================================
+
+The most common reason for the first few tasks to go DEAD is an improper path in the ``parm_xml.yaml`` configuration file.
+In particular, ``exp_basedir`` must be set to the directory above ``land-DA_workflow``. For example, if ``land-DA_workflow`` resides at ``/Users/Jane.Doe/landda/land-DA_workflow``, then ``exp_basedir`` must be set to ``/Users/Jane.Doe/landda``. After correcting ``parm_xml.yaml``, users will need to regenerate the workflow XML by running:
+
+.. code-block:: console
+
+ uw template render --input-file templates/template.land_analysis.yaml --values-file parm_xml.yaml --output-file land_analysis.yaml
+ uw rocoto realize --input-file land_analysis.yaml --output-file land_analysis.xml
+
+Then, rewind the DEAD tasks as described :ref:`below ` using ``rocotorewind``, and use ``rocotorun`` to advance the workflow and ``rocotostat`` to check its status (see :numref:`Section %s ` for details).
+
+If the first few tasks run successfully, but future tasks go DEAD, users will need to check the experiment log files, located at ``$EXP_BASEDIR/ptmp/test/com/output/logs``. It may also be useful to check that the JEDI directory and other paths and values are correct in ``parm_xml.yaml``.
+
+
.. _RestartTask:
How do I restart a DEAD task?
@@ -27,15 +45,15 @@ On platforms that utilize Rocoto workflow software (including Hera and Orion), i
200001030000 post_anal 61746109 SUCCEEDED 0 1 4.0
- 200001030000 plot_stats 61746110 SUCCEEDED 0 1 70.0
200001030000 forecast 61746128 DEAD 256 1 -
+ 200001030000 plot_stats - - - - -
-This means that the dead task has not completed successfully, so the workflow has stopped. Once the issue has been identified and fixed (by referencing the log files in ``$LANDDAROOT/ptmp/test/com/output/logs``), users can re-run the failed task using the ``rocotorewind`` command:
+This means that the DEAD task has not completed successfully, so the workflow has stopped. Once the issue has been identified and fixed (e.g., by referencing the log files in ``$LANDDAROOT/ptmp/test/com/output/logs``), users can rewind, or "undo," the failed task using the ``rocotorewind`` command:
.. code-block:: console
rocotorewind -w land_analysis.xml -d land_analysis.db -v 10 -c 200001030000 -t forecast
where ``-c`` specifies the cycle date (first column of ``rocotostat`` output) and ``-t`` represents the task name
-(second column of ``rocotostat`` output). After using ``rocotorewind``, the next time ``rocotorun`` is used to
-advance the workflow, the job will be resubmitted.
+(second column of ``rocotostat`` output). This will set the number of tries to 0, as though the task has not been run. After using ``rocotorewind``, the next time ``rocotorun`` is used to advance the workflow, the job will be resubmitted.
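+
+For example, using the same workflow and database files as above:
+
+.. code-block:: console
+
+ rocotorun -w land_analysis.xml -d land_analysis.db -v 10
+ rocotostat -w land_analysis.xml -d land_analysis.db -v 10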
diff --git a/doc/source/Reference/Glossary.rst b/doc/source/Reference/Glossary.rst
index bd3c1fcd..048b14e5 100644
--- a/doc/source/Reference/Glossary.rst
+++ b/doc/source/Reference/Glossary.rst
@@ -124,16 +124,16 @@ Glossary
Numerical Weather Prediction (NWP) takes current observations of weather and processes them with computer models to forecast the future state of the weather.
RDHPCS
- Research and Development High-Performance Computing Systems.
+ `Research and Development High-Performance Computing Systems `_.
Spack
- `Spack `_ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures.
+ `Spack `_ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers where many users and application teams share common installations of software on clusters with exotic architectures.
spack-stack
- The `spack-stack `_ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `_ and the :jedi:`Joint Effort for Data assimilation Integration (JEDI) <>` framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack.
+ The `spack-stack `_ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the :ufs:`Unified Forecast System (UFS) <>` and the :jedi:`Joint Effort for Data assimilation Integration (JEDI) <>` framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack.
UFS
- The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system consisting of several applications (apps). These apps span regional to global domains and sub-hourly to seasonal time scales. The UFS is designed to support the :term:`Weather Enterprise` and to be the source system for NOAA's operational numerical weather prediction applications. For more information, visit https://ufscommunity.org/.
+ The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system consisting of several applications (apps). These apps span regional to global domains and sub-hourly to seasonal time scales. The UFS is designed to support the :term:`Weather Enterprise` and to be the source system for NOAA's operational numerical weather prediction applications. For more information, visit the :ufs:`UFS Portal <>`.
Umbrella repository
A repository that houses external code, or “externals,” from additional repositories.
diff --git a/doc/source/Reference/Rocoto.rst b/doc/source/Reference/Rocoto.rst
index 24357472..0e9d6dd9 100644
--- a/doc/source/Reference/Rocoto.rst
+++ b/doc/source/Reference/Rocoto.rst
@@ -67,16 +67,16 @@ Executing this command will generate a workflow status table similar to the foll
200001030000 pre_anal druby://10.184.3.62:41973 SUBMITTING - 1 0.0
200001030000 analysis - - - - -
200001030000 post_anal - - - - -
- 200001030000 plot_stats - - - - -
200001030000 forecast - - - - -
+ 200001030000 plot_stats - - - - -
================================================================================================================================
200001040000 prep_obs druby://10.184.3.62:41973 SUBMITTING - 1 0.0
200001040000 pre_anal - - - - -
200001040000 analysis - - - - -
200001040000 post_anal - - - - -
- 200001040000 plot_stats - - - - -
200001040000 forecast - - - - -
-
+ 200001040000 plot_stats - - - - -
+
This table indicates that the ``prep_obs`` task for cycle 200001030000 was sent to the batch system and is now queued, while the ``pre_anal`` task for cycle 200001030000 and the ``prep_obs`` task for cycle 200001040000 are currently being submitted to the batch system.
Note that issuing a ``rocotostat`` command without an intervening ``rocotorun`` command will not result in an updated workflow status table; it will print out the same table. It is the ``rocotorun`` command that updates the workflow database file (in this case ``land_analysis.db``, located in ``parm``). The ``rocotostat`` command reads the database file and prints the table to the screen. To see an updated table, the ``rocotorun`` command must be executed first, followed by the ``rocotostat`` command.
@@ -87,19 +87,19 @@ After issuing the ``rocotorun`` command several times (over the course of severa
CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION
============================================================================================
- 200001030000 prep_obs 18347451 SUCCEEDED 0 1 3.0
- 200001030000 pre_anal 18347452 SUCCEEDED 0 1 5.0
- 200001030000 analysis 18347525 SUCCEEDED 0 1 65.0
- 200001030000 post_anal 18347558 SUCCEEDED 0 1 10.0
- 200001030000 plot_stats 18347559 SUCCEEDED 0 1 73.0
- 200001030000 forecast 18347562 SUCCEEDED 0 1 103.0
- ==========================================================================================
- 200001040000 prep_obs 18347453 SUCCEEDED 0 1 3.0
- 200001040000 pre_anal 18347568 SUCCEEDED 0 1 4.0
- 200001040000 analysis 18347584 SUCCEEDED 0 1 70.0
- 200001040000 post_anal 18347591 SUCCEEDED 0 1 4.0
- 200001040000 plot_stats 18347592 SUCCEEDED 0 1 48.0
- 200001040000 forecast 18347593 RUNNING - 1 0.0
+ 200001030000 prep_obs 1131735 SUCCEEDED 0 1 1.0
+ 200001030000 pre_anal 1131736 SUCCEEDED 0 1 5.0
+ 200001030000 analysis 1131754 SUCCEEDED 0 1 33.0
+ 200001030000 post_anal 1131811 SUCCEEDED 0 1 11.0
+ 200001030000 forecast 1131918 SUCCEEDED 0 1 31.0
+ 200001030000 plot_stats 1131944 SUCCEEDED 0 1 26.0
+ ============================================================================================
+ 200001040000 prep_obs 1131737 SUCCEEDED 0 1 2.0
+ 200001040000 pre_anal 1131945 SUCCEEDED 0 1 3.0
+ 200001040000 analysis 1132118 SUCCEEDED 0 1 29.0
+ 200001040000 post_anal 1132174 SUCCEEDED 0 1 10.0
+ 200001040000 forecast 1132186 SUCCEEDED 0 1 31.0
+ 200001040000 plot_stats 1132319 RUNNING - 1 0.0
When the workflow runs to completion, all tasks will be marked as SUCCEEDED. The log file for each task is located in ``$LANDDAROOT/ptmp/test/com/output/logs``. If any task fails, the corresponding log file can be checked for error messages. Optional arguments for the ``rocotostat`` command can be found in the `Rocoto documentation `_.
diff --git a/doc/source/conf.py b/doc/source/conf.py
index 294e48cc..b5b98ffc 100644
--- a/doc/source/conf.py
+++ b/doc/source/conf.py
@@ -7,7 +7,7 @@
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information
project = "UFS Offline Land DA User's Guide"
-copyright = '2023, '
+copyright = '2024, '
author = ' '
# The short X.Y version
@@ -47,9 +47,9 @@
# Documentation-wide substitutions
rst_prolog = """
-.. |latestr| replace:: v1.2.0
-.. |tag| replace:: ``ufs-land-da-v1.2.0``
-.. |branch| replace:: ``release/public-v1.2.0``
+.. |latestr| replace:: v2.0.0
+.. |tag| replace:: ``ufs-land-da-v2.0.0``
+.. |branch| replace:: ``release/public-v2.0.0``
.. |skylabv| replace:: Skylab v7.0
.. |spack-stack-ver| replace:: v1.6.0
"""
@@ -64,6 +64,8 @@
linkcheck_ignore = [r'https://www\.intel\.com/content/www/us/en/developer/tools/oneapi/hpc\-toolkit\-download\.html',
r'https://doi.org/10.1029/.*',
r'https://doi.org/10.1002/.*',
+ r'https://doi.org/10.5281/zenodo.13909475', # DOI not published until release
+ r'https://sourceforge.net/projects/xming/',
]
# Ignore anchor tags for Land DA data bucket. Shows Not Found even when they exist.
@@ -131,5 +133,6 @@ def setup(app):
'land-wflow-wiki': ('https://github.com/ufs-community/land-DA_workflow/wiki/%s','%s'),
'spack-stack': ('https://spack-stack.readthedocs.io/en/1.6.0/%s', '%s'),
'ufs-wm': ('https://ufs-weather-model.readthedocs.io/en/develop/%s', '%s'),
+ 'ufs': ('https://ufs.epic.noaa.gov/%s', '%s'),
'uw': ('https://uwtools.readthedocs.io/en/main/%s', '%s'),
}
diff --git a/doc/source/doc-snippets/load-env.rst b/doc/source/doc-snippets/load-env.rst
new file mode 100644
index 00000000..9a70e491
--- /dev/null
+++ b/doc/source/doc-snippets/load-env.rst
@@ -0,0 +1,8 @@
+.. code-block:: console
+
+ cd $LANDDAROOT/land-DA_workflow
+ module use modulefiles
+ module load wflow_<platform>
+ conda activate land_da
+
+where ``<platform>`` is ``hera``, ``orion``, or ``hercules``.
\ No newline at end of file
diff --git a/doc/source/index.rst b/doc/source/index.rst
index 69855368..207e0d6c 100644
--- a/doc/source/index.rst
+++ b/doc/source/index.rst
@@ -1,7 +1,7 @@
-.. UFS Offline LandDA documentation master file, created by
+.. UFS Land DA documentation master file, created by
sphinx-quickstart on Fri Jan 20 10:35:26 2023.
-UFS Offline Land DA User's Guide |release|
+UFS Land DA User's Guide |release|
============================================
.. toctree::