WRF and WRF-CHEM


Weather Research and Forecasting Model

WRF and WRF-CHEM have been installed on Abel. To check which version is available:


module avail wrf

-------------------------------------------------------------------------------------- /cluster/etc/modulefiles ---------------------------------------------------------------------------------------
wrf/3.6.1


To set up your environment:


module load wrf/3.6.1


When loading wrf, the following environment variables are defined:

  • WRF_HOME: directory where WRF has been installed; this can be used to check WRF sources
  • WPS_HOME: directory where WPS has been installed. It is unlikely that you will need to check or change the WPS source code, but in this directory you will find all the metadata required to run WPS (the GEOGRID.TBL, METGRID.TBL, and Vtable files).
  • WPS_GEOG_PATH: contains the terrestrial static input data. Even if you use your own compiled version of WRF/WPS, it is NOT necessary to download these files again.
  • WRFIO_NCD_LARGE_FILE_SUPPORT: it is set to 1 to allow you to store large netCDF files (more than 2GB).
  • WRF_EXAMPLES: directory where all WRF and WRF-CHEM examples are stored. If you wish to run one of these examples, see the dedicated sections below.
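
For example, once the module is loaded you can quickly check that the environment is set up as expected (the exact paths and directory contents depend on the installed version):

echo $WRF_HOME
echo $WPS_GEOG_PATH
ls $WRF_EXAMPLES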

 

WRF has been compiled with Intel compilers and MPI:

module list 

Currently Loaded Modulefiles:
  1) use.own                 3) openmpi.intel/1.6.1     5) netcdf.intel/4.2.1.1    7) jasper/1.900.1          9) openmpi.intel/1.8      11) wrf/3.6.1
  2) intel/2011.10           4) hdf5/1.8.9_intel        6) intel-libs/2013.sp1.3   8) intel/2013.sp1.3       10) ncl/6.2.0


We suggest you always load a specific version (module load wrf/3.6.1 rather than module load wrf) to avoid problems if a new default version is installed; it is important to stick to the very same version throughout your simulations.


Running WRF:

The following steps are necessary to run WRF on our systems:

  1. WRF Preprocessing System (WPS)
  2. WRF System


WRF Preprocessing System (WPS)

Most parameters for WPS must be given in namelist.wps

  • define_grid.py uses namelist.wps to plot your defined grid
  • run_geogrid.py  runs geogrid.exe to create static data for your domain
  • run_ungrib.py runs ungrib.exe to unpack your input GRIB data
  • run_metgrid.py runs metgrid.exe to interpolate input data onto your model domain


WRF

Most parameters for WRF must be given in namelist.input

  • run_init_wrf.py runs real.exe to initialize WRF; it creates two files, wrfinput_d<domain> and wrfbdy_d<domain>, where <domain> is the domain number (01, 02, etc.)
  • run_wrf.py runs the numerical integration program wrf.exe
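
Putting the two stages together, a complete experiment runs these scripts in the following order. This is only a sketch: myexp and the paths are placeholders, and the flags mirror those used in the example workflows (see the January2000Case section below for a concrete, runnable version):

define_grid.py  --path /path/to/myexp        # plot the domain defined in namelist.wps
run_geogrid.py  -p /path/to/myexp --expid myexp
run_ungrib.py   --expid myexp --vtable /path/to/Vtable --datadir /path/to/GRIB/
run_metgrid.py  --expid myexp
run_init_wrf.py --expid myexp --namelist /path/to/namelist.input
run_wrf.py      --expid myexp --namelist /path/to/namelist.input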


TIP: to get the usage of these Python scripts, use -h or --help. For instance:

run_ungrib.py -h
Usage: run_ungrib.py --expid expid --vtable vtable --datadir datadir [--start_date startdate --end_date enddate] [--interval_seconds val] [--prefix FILE]

Options:
  -h, --help            show this help message and exit
  -s startdate, --start_date=startdate
                         A list of MAX_DOM character strings of the form
                        'YYYY-MM-DD_HH:mm:ss' specifying the starting UTC date
                        of the simulation for each domain.
  -e enddate, --end_date=enddate
                        A list of MAX_DOM character strings of the form 'YYYY-
                        MM-DD_HH:mm:ss' specifying the ending UTC date of the
                        simulation for each domain
  -t interval_seconds, --interval_seconds=interval_seconds
                        The integer number of seconds between time-varying
                        meteorological input files. No default value.
  -p FILE, --prefix=FILE
                        prefix of the intermediate meteorological data files
  -v Vtable, --vtable=Vtable
                        vtable filename
  -i expid, --expid=expid
                        Experiment identifier
  -d datadir, --datadir=datadir
                        Directory where input fields can be found

For each of these Python scripts, some arguments are optional; they are indicated in square brackets, such as start_date and end_date above. If you don't specify them on the command line, the values defined in your namelist are used. This is usually how you will run your "real" cases.
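
For instance, if start_date, end_date, and interval_seconds are already set in your namelist, the ungrib step can be reduced to its required arguments (myexp and the paths are placeholders):

run_ungrib.py --expid myexp \
              --vtable  /path/to/Vtable \
              --datadir /path/to/GRIB/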

WRF examples:

All WRF examples can be found in $WRF_EXAMPLES

A subdirectory exists for each example and contains everything you need to run it (namelist.wps, namelist.input, Vtable, workflow.bash, and all the DATA required to run the simulation). You may find one or more files named workflow*.bash; these describe the sequence of programs to run for each example. The examples can be divided into two groups: plain WRF cases and WRF-CHEM cases (the latter are described towards the end of this section).


January2000Case

This case is the East Coast Winter Storm of January 24-25, 2000 (http://cimss.ssec.wisc.edu/goes/misc/000125.html)

To run it, you need to execute each of the statements in workflow.bash:


> cat workflow.bash
#!/bin/bash

# check your domain is OK
define_grid.py --path /cluster/software/VERSIONS/wrf/examples/January2000Case

# Run geogrid.exe to create static data for this domain:
run_geogrid.py -p /cluster/software/VERSIONS/wrf/examples/January2000Case --expid January2000Case

# Unpack input GRIB data (ungrib.exe)
run_ungrib.py --expid January2000Case             \
              --start_date 2000-01-24_12:00:00    \
              --end_date 2000-01-25_12:00:00      \
              --interval_seconds 21600            \
              --prefix FILE                       \
              --vtable  /cluster/software/VERSIONS/wrf/examples/January2000Case/Vtable \
              --datadir /cluster/software/VERSIONS/wrf/examples/January2000Case/DATA/


# Interpolate the input data onto our model domain (metgrid.exe)
run_metgrid.py --expid January2000Case

# Initialize WRF model (real.exe/ideal.exe)
run_init_wrf.py --expid January2000Case \
                --namelist /cluster/software/VERSIONS/wrf/examples/January2000Case/namelist.input

# Run the model (wrf.exe)
run_wrf.py --expid January2000Case \
           --namelist /cluster/software/VERSIONS/wrf/examples/January2000Case/namelist.input


You don't have to change paths or namelists: everything has been set up to execute WPS/WRF in your work directory ($WORKDIR), in a subdirectory named January2000Case (named after the experiment identifier given with the --expid option).
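
Once the workflow has finished, you can check the run directory and its outputs:

ls $WORKDIR/January2000Case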

To visualize your outputs (named wrfout*), you may use ncview:

          module load ncview

          ncview wrfout_d01_2000-01-24_12:00:00


and select the variables you wish to plot.  For instance, to visualize SST:

[Screenshots: selecting the SST variable in ncview and the resulting plot]


TIP: ncview is not recommended for publication-quality graphics, but it is a very handy tool for a quick first look at the data.


HurricaneKatrina 

As for the previous example, you just need to run workflow.bash and check the directory $WORKDIR/HurricaneKatrina.

On August 28, 2005, Hurricane Katrina was in the Gulf of Mexico, where it strengthened to a Category 5 storm on the Saffir-Simpson hurricane scale, packing winds estimated at 175 mph. (http://www.katrina.noaa.gov/).

[Image: Hurricane Katrina on August 28, 2005 (taken from http://www2.mmm.ucar.edu/wrf/OnLineTutorial/CASES/SingleDomain/)]

HurricaneKatrinaSST 

The goal here is to input SST into the WRF model. For these runs we will use the Hurricane Katrina case data (2005-08-28_00 to 2005-08-29_00).

SSTs are typically added to the model in one of two ways:

a. Use the SST at the initial time as a constant field for all time periods (good for short runs, such as real-time runs, where the SST is not updated during the WRF model run)
b. Use the SST as an extra input at each model input time (good for long, multi-month runs); see the namelist sketch below
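
For option (b), the time-varying SST enters WRF through auxiliary input stream 4. Below is a minimal namelist.input sketch of the relevant switches; the values (in particular the update interval, in minutes) are illustrative, so check the namelist shipped with this example for the exact settings:

&time_control
 auxinput4_inname   = "wrflowinp_d<domain>",
 auxinput4_interval = 360, 360,
 io_form_auxinput4  = 2,
/
&physics
 sst_update = 1,
/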

NestedModelRuns 

For these runs we will use the Hurricane Katrina case data (2005-08-28_00 to 2005-08-29_00).

The domain we are going to set up is shown below (image taken from http://www2.mmm.ucar.edu/wrf/OnLineTutorial/CASES/NestRuns/index.html).

[Image: nested domain configuration]


There are a number of different ways to set up nested model runs (in this tutorial we are only going to set up two-way interactive nested runs).

a. Two-way nested run, with one input file
The preprocessing steps for this case are similar to a single-domain setup. The only difference is that during the wrf.exe execution, a second (or more) nest(s) is initiated. The corresponding workflow can be found in workflow_a.bash.
b. Two-way nested run, with two input files
For this case the preprocessing programs need to be run to create extra input data for the WRF model run. At the WRF model step, one then has the choice to:
i. use all the meteorological and static data for the nested domains as input (see workflow_b.bash), or
ii. use only the static data for the nested domains as input (see workflow_bb.bash).

c. One-way nesting using ndown
ndown is used to run one-way nested runs AFTER wrf has already been run for the mother domain.
One-way nesting can also be done in the same way as the two-way nested runs (both a and b above) by simply setting feedback to 0 in the WRF namelist.input file, as sketched below. The corresponding workflow can be found in workflow_c.bash.
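
The feedback switch lives in the &domains section of namelist.input; a minimal sketch of the one-way setting:

&domains
 feedback = 0,
/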
 

RestartRun

In this case study we use the same setup as for the single-domain run and simply restart it from the previous run; the relevant namelist switches are sketched below. As for all other examples, run workflow.bash.
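
A restart is controlled from the &time_control section of namelist.input. In this minimal sketch (with an illustrative interval), restart_interval (in minutes) makes the first run write wrfrst* restart files, and restart = .true. makes the second run start from them:

&time_control
 restart          = .true.,
 restart_interval = 1440,
/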

April2005Case 

This example shows how to run from ERA-Interim data instead of GFS data. As you can see in workflow.bash, another Vtable needs to be used (Vtable.EI).
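
In practice the only change from the GFS-driven cases is the --vtable argument passed to run_ungrib.py. A sketch with placeholder dates and paths (see the example's workflow.bash for the exact values):

run_ungrib.py --expid April2005Case \
              --start_date 2005-04-01_00:00:00 \
              --end_date   2005-04-02_00:00:00 \
              --interval_seconds 21600 \
              --prefix FILE \
              --vtable  $WRF_EXAMPLES/April2005Case/Vtable.EI \
              --datadir $WRF_EXAMPLES/April2005Case/DATA/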


WRF has been compiled with WRF_CHEM=1 and WRF_KPP=1 and is therefore suitable for WRF-CHEM simulations. The next three examples show how to use WRF-CHEM at UiO.


BiogenicEmissions 

This example uses WRF-CHEM; its goal is to get familiar with the methodology by which the MEGAN biogenic emissions are introduced into a WRF-Chem simulation.

This exercise is intended for students who already have experience setting up and running the WRF numerical model.

There are two workflows:

  • option-1 (workflow-opt1.bash) uses the GOCART-RACM_KPP aerosol option (chem_opt=301) and Guenther biogenic emissions (bio_emiss_opt=1); dust, sea salt, DMS, and biomass burning are still included, so keep those options turned on (see the namelist sketch after this list).
  • option-2 (workflow-opt2.bash)
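
For option-1, the two settings named above live in the &chem section of namelist.input (a sketch showing only these two switches; the full set is in the namelist shipped with the workflow):

&chem
 chem_opt      = 301,
 bio_emiss_opt = 1,
/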

DustErosion2010 

This example uses WRF-CHEM. Its purpose is to get familiar with the methodology by which the dust erosion fields are introduced through the WRF Preprocessing System (WPS). The corresponding workflow is called workflow.bash.

GOCARTaerosols 

This example uses WRF-CHEM. A global emissions data set was prepared with a program called "prep_chem_sources"; with this program, anthropogenic emissions, GOCART background fields, and biomass burning (wildfire) emissions were previously mapped to the user domain. In this exercise you will use these emissions data and follow the methodology for making a WRF-Chem forecast. The corresponding workflow is called workflow.bash.

Running "long" simulations on abel ==

SLURM batch system

For most of your runs you will need to use the SLURM batch system: on the interactive login node you cannot use more than 30 minutes of CPU time, and you usually need to run WRF in parallel anyway. A minimal job script sketch is shown below.
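
A minimal job script sketch for submitting a run with sbatch; the account name, wall time, task count, and memory are placeholders you must adapt to your project (it is assumed here that run_wrf.py launches wrf.exe on the allocated MPI tasks; check run_wrf.py --help):

#!/bin/bash
#SBATCH --job-name=wrf_run
#SBATCH --account=YOUR_PROJECT    # placeholder: your Abel project account
#SBATCH --time=24:00:00
#SBATCH --ntasks=32
#SBATCH --mem-per-cpu=2G

source /cluster/bin/jobsetup      # standard Abel job prologue
module load wrf/3.6.1

run_wrf.py --expid January2000Case \
           --namelist /cluster/software/VERSIONS/wrf/examples/January2000Case/namelist.input

Submit it with "sbatch job.sh" and monitor it with "squeue -u $USER".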

Save your outputs on your local machine

When running WRF, your outputs are stored in $WORKDIR and deleted after about 45 days. It is therefore important to copy any outputs you want to keep to your local machine (or another permanent location).
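
For example, from your local machine (username and the remote path are placeholders; run "echo $WORKDIR" on Abel to find the actual path):

# on Abel: find out where your outputs are
echo $WORKDIR

# on your local machine: copy the run directory
rsync -av username@abel.uio.no:/work/users/username/January2000Case/ ./January2000Case/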