Getting Started
Clone the ufs-v1.0.0 release branch of the ufs-srweather-app repository:
git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git
cd ufs-srweather-app
Then, check out the submodules for the SRW application.
./manage_externals/checkout_externals
The checkout_externals script uses the configuration file Externals.cfg in the top-level directory and clones the regional workflow, pre-processing utilities, UFS Weather Model, and UPP source code into the appropriate locations under the regional_workflow and src directories.
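A quick way to confirm that the checkout succeeded (a simple sanity check, not part of the official instructions) is to verify that the external directories have been populated:
ls regional_workflow
ls src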
Check the list of supported platforms and compilers to determine the level of support for your platform and compiler. You will need to know whether your platform is pre-configured, configurable, limited-test, or build-only.
If you are running on Stampede, see the Getting Started instructions for that machine here.
There are a number of prerequisite libraries that must be installed before building and running this application. If you are running on a pre-configured platform, those libraries are already installed on your system and this step may be skipped.
If you are not running on a pre-configured platform, you will need to build the required libraries. The NCEPLIBS-external wiki page is the starting point for building the libraries.
Once the prerequisite libraries are built, the workflow must be able to find them via the environment variable NCEPLIBS_DIR. If you followed the instructions on the NCEPLIBS-external wiki page, NCEPLIBS_DIR should be set to $WORK/NCEPLIBS-ufs-v1.1.0. On some systems (Stampede2, for example), you MUST source the setenv_nceplibs.sh script in your .bashrc or equivalent startup script so that compute nodes can find the libraries. For example, you would add the following to your .bashrc:
source $WORK/NCEPLIBS-ufs-v1.1.0/bin/setenv_nceplibs.sh
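If NCEPLIBS_DIR is not already set in your environment, it can also be exported manually; a minimal sketch, assuming the install location described on the NCEPLIBS-external wiki page:
# assumes NCEPLIBS were installed under $WORK as described above
export NCEPLIBS_DIR=$WORK/NCEPLIBS-ufs-v1.1.0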
If you are on a pre-configured platform, there is an input data directory with sample initialization/lateral boundary condition (IC/LBC) data for one experiment (GST case) already prepared. The data can be found in the following locations on these machines:
On Cheyenne:
/glade/p/ral/jntp/UFS_SRW_app/staged_extrn_mdl_files
On Hera:
/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data
On Jet:
/lfs4/BMC/wrfruc/FV3-LAM/model_data
On Orion:
/work/noaa/fv3-cam/UFS_SRW_app/v1p0/model_data
On Gaea:
/lustre/f2/pdata/esrl/gsd/ufs/ufs-srw-release-v1.0.0/staged_extrn_mdl_files
On WCOSS Cray:
/gpfs/hps3/emc/meso/noscrub/UFS_SRW_App/model_data
On WCOSS Dell:
/gpfs/dell2/emc/modeling/noscrub/UFS_SRW_App/model_data
If you are not on a pre-configured platform, you can download the sample data file from one of two locations provided below.
AWS S3 bucket:
https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/ic/gst_model_data.tar
EMC ftp site:
https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/gst_model_data.tar.gz
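If you are downloading the data yourself, a sketch of fetching and unpacking the EMC ftp tarball into a staging directory of your choice (the /path-to/model_data location here is only a placeholder):
mkdir -p /path-to/model_data && cd /path-to/model_data
wget https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/gst_model_data.tar.gz
tar xvzf gst_model_data.tar.gz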
Once the data are staged, the path will need to be set in your config.sh as described below in Section 5.
Instructions for loading the proper modules and/or setting the correct environment variables can be found in the env/ directory, in files named build_<platform>_<compiler>.env. The commands in those files can be copied and pasted directly to the command line, or the file can be sourced. You may need to modify certain variables, such as the path to the NCEP libraries for your individual platform, or use setenv rather than export depending on your environment:
ls -l env/
-rw-rw-r-- 1 user ral 466 Jan 21 08:41 build_cheyenne_intel.env
-rw-rw-r-- 1 user ral 461 Jan 21 08:41 build_hera_intel.env
-rw-rw-r-- 1 user ral 543 Jan 21 08:41 build_jet_intel.env
...
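For example, on Hera with the Intel compiler (one of the files listed above), the build environment could be set up from the top-level directory with:
source env/build_hera_intel.env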
Build the executables as follows:
mkdir build
cd build
Run cmake to set up the Makefile, then run make:
cmake .. -DCMAKE_INSTALL_PREFIX=..
make -j4 >& build.out &
Output from the build will be in the ufs-srweather-app/build/build.out file. If this step is successful, there should be twelve executables in ufs-srweather-app/bin, including the model executable NEMS.exe.
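A quick check, assuming you are still in the build directory created above, is to list the installed executables:
ls ../bin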
Generating the workflow experiment requires three steps: (1) setting the experiment parameters in config.sh, (2) setting the Python and other environment parameters, and (3) running the generate_FV3LAM_wflow.sh script.
First, set the experiment parameters in config.sh:
cd ../regional_workflow/ush
cp config.community.sh config.sh
Edit config.sh to set MACHINE to the machine you are running on, ACCOUNT to an account you can charge, and EXPT_SUBDIR to a name for the experiment:
MACHINE="your machine, e.g. hera or cheyenne"
ACCOUNT="your account"
EXPT_SUBDIR="my_expt_name"
If you have access to the NOAA HPSS from the machine you are running on, those changes should be sufficient; however, if that is not the case (for example, on Cheyenne), or if you have pre-staged the initialization data, the following parameters should also be set:
USE_USER_STAGED_EXTRN_FILES="TRUE"
EXTRN_MDL_SOURCE_BASEDIR_ICS="/path-to/model_data/FV3GFS"
EXTRN_MDL_FILES_ICS=( "gfs.pgrb2.0p25.f000" )
EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path-to/model_data/FV3GFS"
EXTRN_MDL_FILES_LBCS=( "gfs.pgrb2.0p25.f006" "gfs.pgrb2.0p25.f012" "gfs.pgrb2.0p25.f018" "gfs.pgrb2.0p25.f024" "gfs.pgrb2.0p25.f030" "gfs.pgrb2.0p25.f036" "gfs.pgrb2.0p25.f042" "gfs.pgrb2.0p25.f048" )
For pre-configured machines, the /path-to/model_data portion of these paths can be set to the pre-staged IC/LBC locations for the GST case listed in Section 2 (Prepare the build).
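As an illustration only, assuming the Hera staging location from Section 2 and the same FV3GFS subdirectory layout shown above, the base directories on Hera might be set to:
EXTRN_MDL_SOURCE_BASEDIR_ICS="/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS"
EXTRN_MDL_SOURCE_BASEDIR_LBCS="/scratch2/BMC/det/UFS_SRW_app/v1p0/model_data/FV3GFS"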
Second, load the appropriate Python environment for the workflow. The workflow requires Python 3 with the PyYAML, Jinja2, and f90nml packages available. This Python environment has already been set up on Level 1 platforms and can be activated as follows (from /path-to-ufs-srweather-app/regional_workflow/ush):
source ../../env/wflow_<platform>.env
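For example, on Hera (assuming the platform naming follows that of the build files above):
source ../../env/wflow_hera.env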
Third, generate the workflow:
./generate_FV3LAM_wflow.sh
The generated workflow will be in $EXPTDIR, where EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}. A log file called log.generate_FV3LAM_wflow is generated by this step and can also be found in $EXPTDIR.
If the Rocoto software is available on your platform, you can follow the steps in this section to run the workflow. If Rocoto is not available, the workflow can be run using stand-alone scripts described here.
An environment variable may be set to navigate to $EXPTDIR more easily. If the login shell is bash, it can be set as follows:
export EXPTDIR=/path-to-experiment/directory
Or if the login shell is csh/tcsh, it can be set using:
setenv EXPTDIR /path-to-experiment/directory
Go to the experiment directory:
cd $EXPTDIR
To run Rocoto using the script:
./launch_FV3LAM_wflow.sh
Once the workflow is launched with the launch_FV3LAM_wflow.sh script, a log file named log.launch_FV3LAM_wflow will be created in $EXPTDIR (or appended to, if it already exists).
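A simple way to check on progress is to inspect the end of that log after each launch, for example:
tail -n 40 $EXPTDIR/log.launch_FV3LAM_wflow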
Or, to call Rocoto manually:
First, load the Rocoto module for your platform.
For Cheyenne:
module use -a /glade/p/ral/jntp/UFS_SRW_app/modules/
module load rocoto
For Hera or Jet:
module purge
module load rocoto
For Orion:
module purge
module load contrib rocoto
For Gaea:
module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles
module load rocoto/1.3.3
For WCOSS_DELL_P3:
module purge
module load lsf/10.1
module use /gpfs/dell3/usrx/local/dev/emc_rocoto/modulefiles/
module load ruby/2.5.1 rocoto/1.2.4
For WCOSS_CRAY:
module purge
module load xt-lsfhpc/9.1.3
module use -a /usrx/local/emc_rocoto/modulefiles
module load rocoto/1.2.4
Then call rocotorun manually to launch the tasks that have all dependencies satisfied, and rocotostat to monitor progress:
rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
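If a task fails, rocotocheck can help diagnose why; a sketch, where the cycle (in YYYYMMDDHHMM format) and the task name are placeholders to replace with values reported by rocotostat:
rocotocheck -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -c 201906150000 -t run_fcst -v 10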
For automatic re-submission of the workflow (every 3 minutes), the following line can be added to the user's crontab (use "crontab -e" to edit the cron table; examples are for Cheyenne):
*/3 * * * * cd /glade/p/ral/jntp/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 && ./launch_FV3LAM_wflow.sh
If you want to turn off entries in the crontab, you can comment them out using a "#" at the beginning of each line.
The Python plotting scripts are located in the ufs-srweather-app/regional_workflow/ush/Python directory. The plot_allvars.py script plots the output from a single run, while the plot_allvars_diff.py script plots the difference between two runs. If you are plotting the difference, the runs must be on the same domain and available for the same forecast hours.
Generate the Python plots:
cd ufs-srweather-app/regional_workflow/ush/Python
The appropriate environment must be loaded to run the scripts, which require Python 3 with the scipy, matplotlib, pygrib, cartopy, and pillow packages. This Python environment has already been set up on Level 1 platforms and can be activated as follows (Note: if you are using the batch submission scripts, the environments are set for you and you do not need to set them on the command line before running the script; see further instructions below):
On Cheyenne:
module load ncarenv
ncar_pylib /glade/p/ral/jntp/UFS_SRW_app/ncar_pylib/python_graphics
On Hera and Jet:
module use -a /contrib/miniconda3/modulefiles
module load miniconda3
conda activate pygraf
On Orion:
module use -a /apps/contrib/miniconda3-noaa-gsl/modulefiles
module load miniconda3
conda activate pygraf
On Gaea:
module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles
module load miniconda3/4.8.3-regional-workflow
The Cartopy shape files are available on a number of Level 1 platforms in the following locations:
On Cheyenne:
/glade/p/ral/jntp/UFS_SRW_app/tools/NaturalEarth
On Hera:
/scratch2/BMC/det/UFS_SRW_app/v1p0/fix_files/NaturalEarth
On Jet:
/lfs4/BMC/wrfruc/FV3-LAM/NaturalEarth
On Orion:
/work/noaa/gsd-fv3-dev/UFS_SRW_App/v1p0/fix_files/NaturalEarth
On Gaea:
/lustre/f2/pdata/esrl/gsd/ufs/NaturalEarth
To run the Python plotting script for a single run, six command-line arguments are required:
- Cycle date/time in YYYYMMDDHH format
- Starting forecast hour in HHH format
- Ending forecast hour in HHH format
- Forecast hour increment in HHH format
- EXPT_DIR: experiment directory where the post-processed data are found, under EXPT_DIR/YYYYMMDDHH/postprd
- CARTOPY_DIR: base directory of the cartopy shapefiles, with a file structure of CARTOPY_DIR/shapefiles/natural_earth/cultural/*.shp
An example of plotting output from a cycle generated using the sample experiment/workflow configuration in the config.community.sh script (which uses the GFSv15p2 suite definition file) is as follows:
python plot_allvars.py 2019061500 6 48 6 /path-to/expt_dirs/test_CONUS_25km_GFSv15p2 /path-to/NaturalEarth
The output files (in .png format) will be located in the directory EXPTDIR/CDATE/postprd, where in this case EXPTDIR is /path-to/expt_dirs/test_CONUS_25km_GFSv15p2 and CDATE is 2019061500.
To generate difference plots, the plot_allvars_diff.py script must be called with the following seven command line arguments:
- Cycle date/time (CDATE) in YYYYMMDDHH format
- Starting forecast hour
- Ending forecast hour
- Forecast hour increment
- The top level of the first experiment directory EXPTDIR1 containing the first set of post-processed data. The script will look for the data files in the directory EXPTDIR1/CDATE/postprd.
- The top level of the second experiment directory EXPTDIR2 containing the second set of post-processed data. The script will look for the data files in the directory EXPTDIR2/CDATE/postprd.
- The base directory CARTOPY_DIR of the cartopy shapefiles. The script will look for the shape files (*.shp) in the directory CARTOPY_DIR/shapefiles/natural_earth/cultural.
An example of plotting the differences between two experiments for the same date and predefined domain, one using the "FV3_GFS_v15p2" suite definition file (SDF) and the other using the "FV3_RRFS_v1alpha" SDF, is as follows:
python plot_allvars_diff.py 2019061518 6 18 3 /path-to/expt_dirs1/test_CONUS_3km_GFSv15p2 /path-to/expt_dirs2/test_CONUS_3km_RRFSv1alpha /path-to/NaturalEarth
In this case, the output .png files will be located in the directory EXPTDIR1/CDATE/postprd.
If the Python scripts are being used to create plots of multiple forecast lead times and forecast variables, you may need to submit them to the batch system. Example batch scripts are provided: sq_job.sh and sq_job_diff.sh for platforms such as Hera that use the Slurm job scheduler, and qsub_job.sh and qsub_job_diff.sh for platforms such as Cheyenne that use PBS.
At a minimum, the account should be set appropriately prior to job submission:
#SBATCH --account=an_account
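The #SBATCH line above applies to the Slurm scripts; for the PBS-based qsub_job.sh and qsub_job_diff.sh, the analogous directive (standard PBS syntax, shown here as a sketch) would be:
#PBS -A an_account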
Depending on the platform you are running on, you may also need to adjust the settings to use the correct Python environment and path to the shape files.
When using these batch scripts, several environment variables must be set prior to submission. If plotting output from a single cycle, the variables to set are HOMErrfs and EXPTDIR. They can be set with setenv (csh/tcsh) or export (bash) as follows:
setenv HOMErrfs /path-to/ufs-srweather-app/regional_workflow   or   export HOMErrfs=/path-to/ufs-srweather-app/regional_workflow
setenv EXPTDIR /path-to/EXPTDIR   or   export EXPTDIR=/path-to/EXPTDIR
If plotting the difference between the same cycle from two different experiments, the variables to set are HOMErrfs, EXPTDIR1, and EXPTDIR2. Again, using setenv (csh/tcsh) or export (bash):
setenv HOMErrfs /path-to/ufs-srweather-app/regional_workflow   or   export HOMErrfs=/path-to/ufs-srweather-app/regional_workflow
setenv EXPTDIR1 /path-to/EXPTDIR1   or   export EXPTDIR1=/path-to/EXPTDIR1
setenv EXPTDIR2 /path-to/EXPTDIR2   or   export EXPTDIR2=/path-to/EXPTDIR2
In addition, the variables CDATE, FCST_START, FCST_END, and FCST_INC in the batch scripts can be modified depending on the user’s needs.
export CDATE=${DATE_FIRST_CYCL}${CYCL_HRS}
export FCST_START=6
export FCST_END=${FCST_LEN_HRS}
export FCST_INC=6
The scripts must be submitted using the command appropriate for the job scheduler on your platform. For example, on Hera, sq_job.sh can be submitted as follows:
sbatch sq_job.sh
On Cheyenne, qsub_job.sh can be submitted as follows:
qsub qsub_job.sh
The output .png files will be located in the experiment directory EXPTDIR/CDATE/postprd.
For more detailed information on the application, see the UFS Short-Range Weather App Users Guide.