WRF Tutorial

Revision as of 11:00, 18 May 2018

Super unfinished! Will update soon.

This is a guide for configuring, compiling, and running WRF and the WRF Preprocessing System (WPS) on SHARCNET machines for '''real data''' simulations. This guide is meant to supplement the [http://www2.mmm.ucar.edu/wrf/OnLineTutorial/index.htm WRF User Tutorial]. Graham has pre-compiled WRF modules available; however, you may find yourself needing to compile WRF and WPS yourself. Both options, along with supplementary troubleshooting material, can be found below.

== Using Pre-compiled Modules ==

As of May 2018, the versions of WRF and WPS that work together on Graham are WRF 3.8.1 and WPS 3.8.1. Copy the following directories into <code>WRF</code> and <code>WPS</code> directories in your desired <code>~/projects</code> directory:

 /cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/wrf/3.8.1/WRF
 /cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/wps/3.8.1/WPS
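A minimal sketch of this copy step, assuming the module-tree paths above; the destination directory name <code>wrf-run</code> is a placeholder, so use any directory in your own project space:

```shell
# Copy the pre-compiled WRF/WPS source trees into your own project
# space. "wrf-run" is a placeholder name.
DEST="$HOME/projects/wrf-run"
mkdir -p "$DEST"

WRF_SRC=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/wrf/3.8.1/WRF
WPS_SRC=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/wps/3.8.1/WPS

# Copy only if the module tree is visible on this machine.
if [ -d "$WRF_SRC" ]; then cp -r "$WRF_SRC" "$DEST/WRF"; fi
if [ -d "$WPS_SRC" ]; then cp -r "$WPS_SRC" "$DEST/WPS"; fi
```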

To use the modules, type:

 module load wrf/3.8.1
 module load wps/3.8.1

at the start of each session. It is helpful to add these lines to your <code>~/.bashrc</code>.
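For example, the two lines can be appended to <code>~/.bashrc</code> like so:

```shell
# Append the module loads to ~/.bashrc so each new session picks
# them up automatically (module versions as used in this guide).
cat >> ~/.bashrc <<'EOF'
module load wrf/3.8.1
module load wps/3.8.1
EOF
```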

To view the options that were used to configure and compile WRF and WPS, see:

 WRFV3/configure.wrf
 WPS/configure.wps

== Compiling WRF and WPS ==

=== Setting the netCDF variable ===

WRF and WPS need to know the path to the proper netCDF installation. You will need to set the <code>NETCDF</code> environment variable to the '''fortran-mpi''' netCDF directory. On Graham, this can be done by typing the following:

 export NETCDF=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/netcdf-fortran-mpi/4.4.4

Check to make sure that the file <code>$NETCDF/include/netcdf.inc</code> exists.

At this point, you may also choose to allow for large (>2GB) netCDF files. This is recommended for any high-resolution WRF simulations.

 export WRFIO_NCD_LARGE_FILE_SUPPORT=1
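Putting the two settings together, a minimal environment setup with a sanity check on the include file might look like:

```shell
# Set the netCDF environment for WRF/WPS on Graham and enable large
# (>2GB) netCDF files, then check that the Fortran include file exists.
export NETCDF=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/netcdf-fortran-mpi/4.4.4
export WRFIO_NCD_LARGE_FILE_SUPPORT=1

if [ -f "$NETCDF/include/netcdf.inc" ]; then
    echo "netCDF looks good: $NETCDF"
else
    echo "WARNING: $NETCDF/include/netcdf.inc not found; check the NETCDF path" >&2
fi
```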

=== Compiling WRF ===

In the WRF or WRFV3 directory, type <code>./configure</code>. You will be given a list of computer options. Select '''intel, ifort/icc (dmpar)'''. This is option 15 for WRF 4.0.

In the next section, you may wish to compile for nesting. Option 1, which covers all basic nests, is recommended.

This will create a <code>configure.wrf</code> file containing configure options based on your environment. In rare cases, some paths in this file may be incorrect; you can edit them to the correct paths before continuing.

To compile for real-data cases, type <code>./compile em_real</code>. This will take about an hour and will create the following executables:

 main/ndown.exe
 main/wrf.exe
 main/real.exe
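Since the build is long and verbose, it can help to keep the output in a log file and then check for the executables afterwards; a sketch, run from the WRF directory:

```shell
# Build WRF for real-data cases, saving all output for later
# troubleshooting.
if ! ./compile em_real > compile.log 2>&1; then
    echo "compile exited with an error; see compile.log" >&2
fi

# Report any expected executables that were not built.
for exe in main/real.exe main/ndown.exe main/wrf.exe; do
    if [ ! -x "$exe" ]; then
        echo "missing executable: $exe (see compile.log)" >&2
    fi
done
```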

=== Compiling WPS ===

Make sure the netCDF variable is still set from before. In the WPS directory, type <code>./configure</code>. You will be given a list of computer options. Choose '''linux/intel/dmpar'''. This will create a <code>configure.wps</code> file containing configure options based on your environment. If any paths are incorrect, including the path to your WRF/WRFV3 directory, you can edit them in this file before continuing.

To compile WPS, type <code>./compile</code>. This will create the following executables:

 geogrid/src/geogrid.exe
 metgrid/src/metgrid.exe
 ungrib/src/ungrib.exe

== Acquiring Data ==

To run WRF with real data, you will need to acquire both static geography data and GRIB (General Regularly-distributed Information in Binary form) data that describes the weather conditions.

=== Geography Data ===

Create and navigate to a <code>/geog</code> directory and download the high-resolution geography data. Links to the individual data files are listed on the [http://www2.mmm.ucar.edu/wrf/users/download/get_sources_wps_geog.html WPS geography download page]; fetch the data files linked there (with <code>wget</code>, for example) and unpack them here.

You will need to put the path to this data in the <code>geog_data_path</code> entry of your <code>namelist.wps</code>.

=== GRIB Data ===

Download reanalysis data covering your simulation dates. It is suggested to use NARR-A data via the HAS data access link on the [https://www.ncdc.noaa.gov/data-access/model-data/model-datasets/north-american-regional-reanalysis-narr NOAA Website]. Place this data in the <code>/DATA</code> directory. In the <code>/WPS</code> directory, you will have to link the data from WPS to DATA using:

 ./link_grib.csh path/to/data

If you're using NARR data, you will need to link the appropriate Variable Table. In the <code>/WPS</code> directory, type:

 ln -sf ungrib/Variable_Tables/Vtable.NARR Vtable

== Running WPS ==

There are three components to WPS:

* <code>ungrib.exe</code> takes GRIB data and converts it into an intermediate file format.
* <code>geogrid.exe</code> takes static geographical data and fits it to your specified grid.
* <code>metgrid.exe</code> takes the output from geogrid.exe and ungrib.exe and interpolates the data to your domain for your specified times.

Ungrib and geogrid can be run independently of each other; re-running one does not mean you have to re-run the other. However, metgrid must be run after both ungrib and geogrid, so if you re-run either geogrid or ungrib, you will have to re-run metgrid.

Edit <code>namelist.wps</code> to specify your desired domain. A list of best practices for the namelist can be found [http://www2.mmm.ucar.edu/wrf/users/namelist_best_prac_wps.html here].
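As an illustration, a minimal single-domain <code>namelist.wps</code> might contain entries like the following. Every value here is a placeholder to adapt to your own case, not a recommendation; the 3-hourly <code>interval_seconds</code> matches NARR data:

```fortran
&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2018-05-01_00:00:00',
 end_date   = '2018-05-02_00:00:00',
 interval_seconds = 10800,
/

&geogrid
 e_we = 100,
 e_sn = 100,
 dx = 10000,
 dy = 10000,
 map_proj = 'lambert',
 ref_lat  = 43.5,
 ref_lon  = -80.5,
 truelat1 = 43.5,
 truelat2 = 43.5,
 stand_lon = -80.5,
 geog_data_path = '/path/to/geog'
/
```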

Since these three executables all take a small amount of time and computing power, it is helpful to submit them in the same job. At the end of your WPS [[Graham_Tips#Submit_script|submit script]], put the following:

 srun ./geogrid.exe
 srun ./ungrib.exe
 srun ./metgrid.exe
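For example, a minimal WPS submit script might look like the following; the account string, resource requests, and module versions are placeholders to adapt:

```shell
#!/bin/bash
#SBATCH --account=def-yourpi   # placeholder: your allocation
#SBATCH --time=00:30:00
#SBATCH --ntasks=4
#SBATCH --mem-per-cpu=2G

module load wrf/3.8.1
module load wps/3.8.1

srun ./geogrid.exe
srun ./ungrib.exe
srun ./metgrid.exe
```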


== Running WRF ==

== Useful Tips ==

Topics to be covered: ncview, ncdump, ncks, tslist, bash scripts for tslist, plotgrids_old, etc.

== Troubleshooting Resources ==

To be covered: common WRF errors and the <code>debug_level</code> namelist setting.