NOAH
NOAH is the community land surface model (LSM) in WRF. To run NOAH offline, the offline driver code HRLDAS must be used; otherwise, NOAH is part of WRF and is commonly run in coupled mode. In spring 2014, an offline 2D driver code for Noah-MP was released, which uses the same input files as HRLDAS. The difference is that Noah-MP is run with hrldas-v3.6, while Noah is run with hrldas-v3.4.1. Both models are available at /projects/metos/annefou/hrldas. Copy the relevant folder to your Abel account, copy a corresponding module file from /projects/metos/annefou/modulefiles, and load the relevant module by typing
module load hrldas/v3.4.1   # for Noah
module load hrldas/v3.6     # for Noah-MP
Read about Noah-MP's improvements over Noah in Niu et al. (2011).
If you have set up NOAH and downloaded data, this page describes how to run the model with the offline driver code HRLDAS.
Contents
Getting started with HRLDAS
A list of the steps needed to run HRLDAS is given below. The default forcing data for NOAH/HRLDAS is GFS surface data, produced by NCEP. If you'd instead like to use ERA-Interim, a number of preprocessing steps must be undertaken.
If you use GFS data, the HRLDAS user's guide should be sufficient to get you started.
1) Get access to Abel.uio.no (ask the IT people at MetOs)
2) Install CDO on your Abel account
3) Make sure that HRLDAS works (ask the IT people at MetOs). For my user, HRLDAS is loaded as a module at every log-on. Simply type module load hrldas
4) Choose the study area, possibly using this Domain Wizard
5) Order GFS data for the appropriate domain, or order GRIB data from the ECMWF webpages interim_full_daily or from Bjørg
6) Decompose the GRIB files into individual time steps (4 time steps per day, 6-hour resolution) by running split_time.sh (requires CDO)
7) Calculate specific humidity (must be done when the files contain Td, Pa and T) and CANWAT (starts out as zeros). TdPa2qq.sh is the most recent script; makeQQ.sh is an obsolete version.
8) Decompose the GRIB files into individual variables by running extract_var_after_split.sh
9) Create WPS files in WRF
10) Compile HRLDAS
11) Run the HRLDAS pre-processing
12) Run the model HRLDAS
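Step 6 is scripted in split_time.sh, which is not reproduced on this page. As a hedged sketch of the idea only, CDO's splitsel operator can split a GRIB file into one file per time step; the directory and file names below are hypothetical and the actual script may differ:

```shell
# Hypothetical sketch of splitting GRIB files into individual time steps
# with CDO (the actual split_time.sh may do this differently).
mkdir -p split
for f in rawfiles/*.grb; do
    [ -e "$f" ] || continue          # nothing to do if no GRIB files exist
    base=$(basename "$f" .grb)
    # splitsel,1 writes one output file per time step:
    # split/<base>_000000.grb, split/<base>_000001.grb, ...
    cdo splitsel,1 "$f" "split/${base}_"
done
```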
2) Install CDO
Download CDO to your Abel account. Then, unpack, configure, compile and install by typing
tar -xvf cdo-<version...>.tar
cd cdo-<version...>
./configure
make
make install
Also, do print the CDO reference card.
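On a shared system such as Abel you typically cannot write to the system directories, so a plain make install may fail with a permission error. A common workaround (an assumption about your setup, not a site-specific instruction) is to install into your home directory with the standard --prefix option:

```shell
# Install CDO under $HOME/local instead of the default /usr/local,
# which requires root. --prefix is a standard configure option.
./configure --prefix=$HOME/local
make
make install
# Make the installed cdo binary visible in your shell:
export PATH=$HOME/local/bin:$PATH
```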
5) Order data
If you want to download Era-Interim data on your own, see this page.
After having received the data, please have a look at the file and make sure that all variables are present. Also, check that all pressure levels are present (there should be 38 pressure levels, starting with 1,2,3,5,7,10,20 hPa... and ending with 950, 975, 1000 hPa).
grib_ls *.mars ## or g1print.exe *.mars
If you order .mars files (GRIB files) from Bjørg, she'll give you a file path to where they are saved, for instance
/mn/vann/metos-d3/bjoergr/mars/irene/jan_1992.tgz
To access these data, log on to sverdrup or any other remote machine except Abel. You won't have access to the files from Abel or from your local computer. (Note: you may also have to prepend /net/vann to the file path.)
ssh -YC irenebn@sverdrup
cd <filepath to where I want to store the .tgz file temporarily>
cp /net/vann/mn/vann/metos-d3/bjoergr/mars/irene/jan_1992.tgz .
scp ./jan_1992.tgz /mn/vann/metos-d2/irenebn/HRLDAS/data
rm jan_1992.tgz
Then open a new terminal, go to the data folder and unpack the data. It is advisable to put all new data into a folder called "rawfiles", because the scripts expect to find the data there. If you already have old data stored in rawfiles, rename that folder to something more specific, that is:
mv rawfiles rawfiles_1979-test
mkdir rawfiles
cp ./jan_1992.tgz rawfiles/
cd <filepath to where I want to store the data>
tar xfz jan_1992.tgz
rm jan_1992.tgz
Only after preprocessing should you transfer the files to your Abel account using scp and remove the originals. Preprocessing is described on this site: https://wiki.uio.no/mn/geo/geoit/index.php?title=Running_NOAH
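The transfer itself can look like the following; the paths echo the examples above, the file name jan_1992_preprocessed.grb is hypothetical, and irenebn is the example username used on this page. Run it from the machine where the data are stored:

```shell
# Copy the preprocessed files from sverdrup to Abel, then clean up.
scp /mn/vann/metos-d2/irenebn/HRLDAS/data/jan_1992_preprocessed.grb \
    irenebn@abel.uio.no:~/hrldas/data/
rm /mn/vann/metos-d2/irenebn/HRLDAS/data/jan_1992_preprocessed.grb
```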
9) Create WPS files
Log on to Abel and run WPS/WRF/Noah there. Make sure that the preprocessed data is transferred to Abel.
From the HRLDAS user's guide: "The strategy is to set up model grids and model input files as for a WRF simulation, but then use these files to run HRLDAS instead of WRF."
Consider creating a new folder within WPS/ for your runs. Copy the namelist.wps into this folder and make sure that the GEOGRID.TBL and METGRID.TBL files are available by adding these lines to the namelist.wps:
&GEOGRID opt_geogrid_tbl_path = '~/WRF/WPSv3.6/WPS/geogrid/',
&METGRID opt_metgrid_tbl_path = '~/WRF/WPSv3.6/WPS/metgrid/',
Example namelists, Vtables and .TBL files are found here.
In the folder /WRF/WPS, prepare namelist.wps according to your domain. The Java program WRFPortal may help you find the coordinates.
&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2011-04-01_12:00:00','2007-05-12_00:00:00',
 end_date   = '2011-04-01_12:00:00','2007-05-13_00:00:00',
 interval_seconds = 21600   !// 6 hours
 io_form_geogrid = 2,
/
&geogrid
 parent_id         = 1, 1,
 parent_grid_ratio = 1, 5,
 i_parent_start    = 1, 33,
 j_parent_start    = 1, 55,
 e_we = 77, 156,
 e_sn = 160, 261,
 geog_data_res = '10m',   !// 'modis_lakes+10m','modis_lakes+2m',
 dx = 9000,
 dy = 9000,
 map_proj = 'lambert',
 ref_lat  = 54.3,   !??
 ref_lon  = 10.5,   !??
 truelat1 = 54.3,
 truelat2 = 54.3,
 stand_lon = 10.5,
 geog_data_path = '/projects/metos/annefou/wrf/geog'   !// '/usit/abel/u1/helenbe/WRF_GEOG/geog',
 !// ref_x = 38.5,   !??
 !// ref_y = 80.,    !??
/
&ungrib
 out_format = 'WPS',
 prefix = 'SURF',   !'ATM',   !// ATM is used with Vtable.ECATM, and vice versa.
/
&metgrid
 fg_name = 'ATM','SURF'
 ! constants_name = './TAVGSFC'
 io_form_metgrid = 2,
/
GEO_EM_d01.nc (WRF/WPS/geogrid.exe)
Run geogrid.exe as described at the Running WRF site.
./geogrid.exe >& geogrid.log
This generates a geo_em_d01.nc file. You may check it by typing
ncdump -h geo_em_d01.nc
To change the geo_em file, change namelist.wps and run ./geogrid.exe again.
If geogrid.exe is not visible in the folder /WRF/WPS, WPS must be compiled first.
Details can be found in the WRF user's guide.
SURF:1992-01-01_06 (WRF/WPS/ungrib.exe)
Run ungrib.exe as described at the Running WRF site.
Remember to create links to the gribfiles and the Vtables.
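Linking is commonly done with the WPS helper script link_grib.csh and a symbolic link named Vtable. The data path below is an example, and the choice of Vtable.ECMWF from the WPS distribution is an assumption; pick the Vtable matching your data (this page also mentions Vtable.ECATM):

```shell
# From the WPS directory: link all GRIB files under the GRIBFILE.AAA,
# GRIBFILE.AAB, ... names that ungrib.exe expects.
./link_grib.csh /mn/vann/metos-d2/irenebn/HRLDAS/data/rawfiles/*
# ungrib.exe reads the variable table from a file literally named "Vtable":
ln -sf ungrib/Variable_Tables/Vtable.ECMWF Vtable
```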
After having created the intermediate files, make sure that all variables are present.
util/rd_intermediate.exe SURF\:*
If not, the Vtable must be changed accordingly.
MET_EM_d01.nc (WRF/WPS/metgrid.exe)
Run metgrid.exe as described at the Running WRF site.
Remember to create a link to the METGRID.TBL file.
After having created the NetCDF files met_em*, make sure that all variables are present.
ncdump -h met_em*
If not, the Vtable must be changed accordingly.
WRFINPUT_d01.nc (WRF/WRFV3/run/real.exe)
Then navigate to the folder WRF/WRFV3/run and run real.exe as described at the Running WRF site.
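In the run directory, that amounts to something like the following (the WPS path follows the example earlier on this page; adjust it to your own installation):

```shell
cd ~/WRF/WRFV3/run
# Link the met_em* files produced by metgrid.exe into the run directory:
ln -sf ~/WRF/WPSv3.6/WPS/met_em.d01.* .
# Edit namelist.input to match the dates and domain in namelist.wps, then:
./real.exe >& real.log
```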
If you already have a wrfinput_d01... file, you may check it by typing
ncdump -h wrfinput_d01...
Details can be found in the WRF user's guide.
After having created the NetCDF files wrfinput, make sure that all variables are present.
ncdump -h wrfinput*
10) Compile HRLDAS
Next, run consolidate_grib.exe to consolidate the input files.
Consider creating a new folder within hrldas/hrldas-v3.4.1/HRLDAS_COLLECT_DATA. Copy the namelist.input into this folder. Also, change the GRIB1_PARAMETER_TABLE to be suitable for Era-Interim data. Use the version 128 table from this page: rda.ucar.edu/docs/formats/grib/gribdoc/ecmwf_params.html
export GRIB_ROOT=~/hrldas/hrldas-v3.4.1/HRLDAS_COLLECT_DATA/GRIB_TABLES/ ./consolidate_grib.exe
After having created the NetCDF files <yyyymmddhh>LDASIN, make sure that all variables are present.
ncdump -h <yyyy>*LDASIN*
11) Run HRLDAS
Consider creating a new folder within hrldas/hrldas-v3.4.1/Run. Copy the namelist.hrldas and all tables *.TBL into this folder.
HRLDAS is run with the command
./Noah_hrldas_beta
It requires namelist.hrldas, input files in the format <YYYYMMDDHH>.LDASIN_DOMAIN<>, various .TBL files, and wrfinput_d01.
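As an illustration, the key entries of namelist.hrldas tie together the files mentioned above. The exact option names can differ between HRLDAS versions, so treat this as a sketch and compare it with the namelist.hrldas shipped with your copy; the dates and paths are examples only:

&NOAHLSM_OFFLINE
 HRLDAS_CONSTANTS_FILE = "wrfinput_d01"   ! static fields from real.exe
 INDIR  = "./LDASIN"                      ! forcing files from consolidate_grib.exe
 OUTDIR = "./output"
 START_YEAR  = 1992
 START_MONTH = 01
 START_DAY   = 01
 START_HOUR  = 06
 KDAY = 31                                ! number of days to integrate
 FORCING_TIMESTEP = 21600                 ! 6-hourly forcing, as in namelist.wps
 NOAH_TIMESTEP    = 3600
 OUTPUT_TIMESTEP  = 21600
/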