WRF Tutorial

This is a guide to configuring, compiling, and running WRF and the WRF Preprocessing System (WPS) on Sharcnet machines for real-data simulations. It is meant to supplement the WRF User Tutorial. Graham has pre-compiled WRF modules available, but you may find yourself needing to compile WRF and WPS yourself. Both options, along with obtaining real data, useful tips, and troubleshooting material, are covered below.

Using Pre-compiled Modules

The current (as of May 2018) versions of WRF and WPS that work together on Graham are WPS 3.8.1 and WRF 3.8.1. Copy the following directories into WRF and WPS directories in your desired ~/projects directory:

/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/wrf/3.8.1/WRF
/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/wps/3.8.1/WPS
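
For example, a minimal sketch of the copy step (the destination directory here is just a placeholder; substitute your own project directory):

# Hypothetical destination directory; adjust to your own project space.
DEST=~/projects/def-yourgroup/yourname/wrf-run
mkdir -p $DEST
cp -r /cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/wrf/3.8.1/WRF $DEST/
cp -r /cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/wps/3.8.1/WPS $DEST/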

To use the modules, type:

module load wrf/3.8.1
module load wps/3.8.1

at the start of each session. If you run WRF and WPS often, it is helpful to add these lines to your ~/.bashrc.
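
For instance, a quick way to append them (a sketch; adjust to taste):

# Append the module loads to ~/.bashrc so they run at every login.
echo "module load wrf/3.8.1" >> ~/.bashrc
echo "module load wps/3.8.1" >> ~/.bashrc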

To view the options that were used to configure and compile WRF and WPS, see:

WRFV3/configure.wrf
WPS/configure.wps

Compiling WRF and WPS

Setting the netCDF variable

WRF and WPS need to know the path to the proper netCDF installation. Set the NETCDF environment variable to the fortran-mpi netCDF directory. On Graham, this can be done by typing the following:

export NETCDF=/cvmfs/soft.computecanada.ca/easybuild/software/2017/avx2/MPI/intel2016.4/openmpi2.1/netcdf-fortran-mpi/4.4.4

Check to make sure the file $NETCDF/include/netcdf.inc exists.
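
A quick way to check, using the variable set above:

# Should print the path to netcdf.inc; an error here means NETCDF is set incorrectly.
ls $NETCDF/include/netcdf.inc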

At this point, you may also choose to allow for large (>2GB) netCDF files. This is recommended for any high-resolution WRF simulations.

export WRFIO_NCD_LARGE_FILE_SUPPORT=1

Compiling WRF

In the WRF or WRFV3 directory, type ./configure. You will be given a list of computer options; select intel ifort/icc (dmpar), which is option 15 for WRF 4.0.

In the next section, you may wish to compile for nesting. Option 1, which covers all basic nests, is recommended.

This will create a configure.wrf file containing configure options based on your environment. In rare cases, some paths may be incorrect; you can correct them in this file before continuing.

To compile for real-data cases, type ./compile em_real. This will take about an hour, and will create the following executables:

main/ndown.exe
main/wrf.exe
main/real.exe
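
Since compilation is long, it can be convenient to capture the output in a log file and run the compile in the background (a sketch; the log file name is arbitrary):

# Capture compiler output for later inspection and free up the terminal.
./compile em_real > log.compile 2>&1 &
tail -f log.compile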

Compiling WPS

Make sure the NETCDF variable is still set from before. In the WPS directory, type ./configure. You will be given a list of computer options; choose linux/intel/dmpar. This creates a configure.wps file containing configure options based on your environment. If any paths are incorrect, including the path to your WRF/WRFV3 directory, you can correct them in this file before continuing.

To compile WPS, type ./compile. This will create the following executables:

geogrid/src/geogrid.exe
metgrid/src/metgrid.exe
ungrib/src/ungrib.exe
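
To confirm the build succeeded, you can check that all three executables exist (a quick sketch):

# All three should be listed; a missing executable means that component failed to build.
ls -l geogrid/src/geogrid.exe metgrid/src/metgrid.exe ungrib/src/ungrib.exe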

Acquiring Data

To run WRF using real data, you will need to acquire both static geography data and GRIB (General Regularly-distributed Information in Binary form) data describing the weather conditions.

Geography Data

Create and navigate to a /geog directory and download the high-resolution geography data linked from the WPS download page:

http://www2.mmm.ucar.edu/wrf/users/download/get_sources_wps_geog.html

You will need to put the appropriate path to this data in your namelist.wps.
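
For example, a sketch of unpacking the data and pointing namelist.wps at it (the tarball name below is hypothetical; use whichever file you downloaded, and substitute your own path):

# Unpack the downloaded geography data (the file name will differ).
tar -xzf geog_high_res_mandatory.tar.gz

Then, in the &geogrid section of namelist.wps:

 geog_data_path = '/path/to/your/geog',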

GRIB Data

Download reanalysis data for your preferred simulation dates. It is suggested to use NARR-A data, available through the HAS data access link on the NOAA website. Place this data in the /DATA directory. In the /WPS directory, link the data files into WPS using:

 ./link_grib.csh path/to/data

If you're using NARR data, you will need to link the appropriate Variable Table. In the /WPS directory, type:

ln -sf ungrib/Variable_Tables/Vtable.NARR Vtable

Running WPS

There are three components to WPS:

  • ungrib.exe takes the GRIB data and turns it into an intermediate file format.
  • geogrid.exe takes static geographical data and fits it to your specified grid.
  • metgrid.exe takes output from geogrid.exe and ungrib.exe and interpolates the data to your domain for your specified times.

Ungrib and geogrid can be run independently of each other; re-running one does not mean you have to re-run the other. However, metgrid must be run after both ungrib and geogrid, so if you re-run either geogrid or ungrib, you will have to re-run metgrid.

Edit namelist.wps to specify your desired domain. A list of best practices for the namelist can be found here. If you're running with NARR data, set interval_seconds = 10800. To preview a map of your domain, use ncl:

module load ncl
ncl util/plotgrids_old.ncl
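
As a starting point, here is a minimal single-domain namelist.wps sketch. All values are placeholders to illustrate the layout; set them for your own domain, dates, and geography data path:

&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2017-06-01_00:00:00',
 end_date   = '2017-06-02_00:00:00',
 interval_seconds = 10800,
/

&geogrid
 e_we = 100,
 e_sn = 100,
 dx = 12000,
 dy = 12000,
 map_proj = 'lambert',
 ref_lat = 43.5,
 ref_lon = -80.5,
 truelat1 = 30.0,
 truelat2 = 60.0,
 stand_lon = -80.5,
 geog_data_path = '/path/to/your/geog',
/

&ungrib
 out_format = 'WPS',
 prefix = 'FILE',
/

&metgrid
 fg_name = 'FILE',
/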

Since these three executables all take a small amount of time and computing power, it is helpful to submit them in the same job. At the end of your WPS submit script, put the following:

srun ./geogrid.exe
srun ./ungrib.exe
srun ./metgrid.exe
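
A sketch of a complete WPS submit script might look like the following. The account name and resource requests are placeholders; adjust them for your allocation, and drop the module lines if you compiled WPS yourself:

#!/bin/bash
#SBATCH --account=def-yourgroup   # hypothetical account; use your own
#SBATCH --time=01:00:00
#SBATCH --ntasks=4
#SBATCH --mem-per-cpu=4G

module load wrf/3.8.1
module load wps/3.8.1

srun ./geogrid.exe
srun ./ungrib.exe
srun ./metgrid.exe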

Check your *.log files for errors. A met_em file should be created for each time in your simulation: met_em.d01.YYYY-MM-DD_HH:00:00.nc.

Running WRF

Run WRF from either your /WRF/test/em_real or /WRF/real directory. Link your met_em files to this directory:

 ln -sf path_to_met_em_files/met_em.d0* . 

Edit the namelist.input to match the WPS domain and choose physics parameterizations. Best practices can be found here. Run ./real.exe and then ./wrf.exe.

If you're running a high-resolution or large-domain simulation, you may run into memory allocation errors when defining the grid. It is useful to submit your job using complete nodes with #SBATCH --mem=125G to run ./wrf.exe.
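
For reference, a sketch of a WRF submit script using a complete node (the account name and time limit are placeholders; real.exe and wrf.exe can also be submitted as separate jobs):

#!/bin/bash
#SBATCH --account=def-yourgroup    # hypothetical account; use your own
#SBATCH --time=12:00:00
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=32
#SBATCH --mem=125G                 # request (nearly) all memory on a Graham base node

module load wrf/3.8.1

srun ./real.exe
srun ./wrf.exe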

Post Processing

For a quick look at your data (during or after a run), use ncview:

module load ncview
ncview wrfout_d01_YYYY-MM-DD_HH:00:00

For a more detailed look at your data and for creating nicer graphics, you can use NCL or VisIt. To use VisIt, append a .nc to the end of your wrfout files:

cp wrfout_d01_YYYY-MM-DD_HH:00:00 wrfout_d01_YYYY-MM-DD_HH:00:00.nc
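
To do this for every output file at once, a small loop works (a sketch; note that this copies rather than renames, so it doubles the disk usage):

# Give every wrfout file a .nc extension so VisIt recognizes it.
for f in wrfout_d01_*; do
    case "$f" in *.nc) continue ;; esac   # skip files that already have the extension
    cp "$f" "$f.nc"
done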

Useful Tips

Extracting time series from a specific location

WRF can easily record time series at specific station locations using the tslist capability. The surface variables t, q, u, v, psfc, glw, gsw, hfx, lh, tsk, tslb, rainc, rainnc, and clw are recorded, along with vertical profiles at the given location for u, v, potential temperature, geopotential height, and water vapour mixing ratio. In the directory from which you run WRF, edit the tslist file to contain a station name, prefix, latitude, and longitude for each location. This must be done before WRF is run.
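
As an illustration, a tslist entry might look like the following (the station name, prefix, and coordinates are made up; the header lines describe the fixed-width columns):

#-----------------------------------------------#
# 24 characters for name | pfx |  LAT  |   LON  |
#-----------------------------------------------#
Waterloo Station          wloo   43.470  -80.540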

After WRF is run, the following files containing time series information for the station locations will have been created:

pfx.dNN.TS, pfx.dNN.UU, pfx.dNN.VV, pfx.dNN.TH, pfx.dNN.PH, pfx.dNN.QV

To extract time series information as an array for a specific variable in pfx.dNN.TS, it is helpful to use the following bash script, wrfTimeSeriesToArray.sh:

#!/bin/bash
# Extract one column of a WRF time-series file (e.g. pfx.dNN.TS) into a single-row array.
if [ $# -ne 4 ]
then
    echo "Required arguments are FILE_NAME COLUMN_NUMBER STARTING_ROW OUTFILE_NAME"
    echo "If the first column starts with the delimiter, you'll have to add 1 to the COLUMN_NUMBER"
    echo "Eg. For u10 in a prefix.TS, type: bash wrfTimeSeriesToArray.sh *.d01.TS 9 2 'prefix-u10'"
    exit 1
fi
# Split on runs of whitespace and print column COL from row SR onwards.
output=($(awk -v SR="$3" -v COL="$2" -F "[[:space:]]+" 'NR>=SR{ if ($COL=="") print ""; else print $COL; }' "$1"))
echo "${output[@]}" >> "$4"

Reducing the size of your output files

Since wrfout files contain a huge number of variables, they can be extremely large, which makes storing and transferring them difficult. However, using ncks, it is easy to create a new netCDF file containing only your variables of interest.

To view all the variables that are stored in your netCDF wrfout file, use:

ncdump -h wrfout_d01_YYYY-MM-DD_HH:00:00

You will need to keep the variables Times, XLAT, XLONG, XTIME, along with your specific variables of interest. For example, to create a new netCDF file containing only variables Times, XLAT, XLONG, XTIME, U, V, and W, type:

module load nco
ncks -v Times,XLAT,XLONG,XTIME,U,V,W original_wrfout_filename new_filename
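
If you have many output files, the same extraction can be applied in a loop (a sketch; the trimmed_ prefix is arbitrary):

# Produce a reduced copy of every wrfout file, keeping only the listed variables.
module load nco
for f in wrfout_d01_*; do
    ncks -v Times,XLAT,XLONG,XTIME,U,V,W "$f" "trimmed_$f"
done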

Troubleshooting Resources

Debug level

When real.exe and wrf.exe don't run properly, look in the rsl.out.0000 and rsl.error.0000 files for error output. However, sometimes these files don't specify where things went wrong. To increase the amount of output printed to rsl.out.0000 and rsl.error.0000, change the debug_level variable in namelist.input. This value can range from 0 to 1000; the larger the number, the more output is printed. Note that large debug_level values can cause the code to run slowly, or not at all. A safe debug_level is around 100.
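
For example, debug_level is set in the &time_control section of namelist.input (shown here as a fragment; 100 is the safe value suggested above):

&time_control
 debug_level = 100,
/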