NOAH


NOAH is the community land surface model (LSM) in WRF. NOAH is normally run as part of WRF in coupled mode; to run it offline, the offline driver code HRLDAS must be used. In spring 2014, an offline 2D driver code for Noah-MP was released, which uses the same input files as HRLDAS. The difference is that Noah-MP is run with hrldas-v3.6, while Noah is run with hrldas-v3.4.1. Both models are available at /projects/metos/annefou/hrldas. Copy the relevant folder to your Abel account, copy a corresponding module file from /projects/metos/annefou/modulefiles, and load the relevant module by typing

module load hrldas/v3.4.1 # for Noah
module load hrldas/v3.6   # for Noah-MP. Read about Noah-MP's improvements over Noah in Niu-2011.

If you have set up NOAH and downloaded data, refer to this page for how to run the model with the offline driver code HRLDAS.

Getting started with HRLDAS

A list of the steps needed to run HRLDAS is given below. The default forcing data for NOAH/HRLDAS is GFS surface data, produced by NCEP. If you would instead like to use ERA-Interim, a number of preprocessing steps must be undertaken. These preprocessing steps are marked with an asterisk.

If you use GFS data, the HRLDAS user's guide should be sufficient to get you started.


1) Get access to Abel.uio.no (ask the IT people at MetOs)
2) Install CDO on your Abel account
3) Make sure that HRLDAS works (ask the IT people at MetOs). For my user, HRLDAS is loaded as a module at every log-on. Simply type module load hrldas 
4) Choose the study area, possibly using this Domain Wizard
5) Order GFS data for the appropriate domain 
or 
5*) Order GRIB data from the ECMWF webpages interim_full_daily or from Bjørg 
6*) Split GRIB files into individual time steps (4 time steps per day, 6-hour resolution) by running split_time.sh (requires CDO)
7*) Calculate specific humidity (must be done when the files contain Td, Pa and T) and CANWAT (starts out as zeros). TdPa2qq.sh is the most recent script; makeQQ.sh is an obsolete version.
8*) Split GRIB files into individual variables by running extract_var_after_split.sh
9) Create WPS files in WRF
10) Compile HRLDAS
11) Run the HRLDAS pre-processing
12) Run the model HRLDAS


2) Install CDO

Download CDO to your Abel account. Then unpack, configure, compile and install by typing

tar -xvf cdo-<version...>.tar 
cd cdo-<version...>
./configure --prefix=$HOME/local   # without root access on Abel, install under your home directory
make
make install

Also consider printing the CDO reference card.
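
To verify the installation, check that the shell finds the binary and that it runs; the $HOME/local prefix below is the assumed install prefix from the configure step above.

export PATH=$HOME/local/bin:$PATH   # assumed install prefix; adjust to your own
cdo -V                              # prints the CDO version and build options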


Pre-processing Era-Interim data

The Era-Interim (ECMWF) data are stored in a different format than the default GFS (NCEP) data, so the EC data must be prepared first. This is taken care of by the scripts split_time.sh, extract_var_after_split.sh and TdPa2QQ.sh. NOTE! Before running extract_var_after_split.sh, make sure that _only_ GRIB files on the hourly timescale are present in the folder; otherwise, incorrect files called "T2_19930131.g" or "T2_199301.mar" will be produced instead of "T2_1993013100".
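
A quick way to spot leftover files that are not on the hourly timescale is to list everything whose name does not end in a ten-digit yyyymmddhh timestamp; this is a sketch, assuming the hourly files end in yyyymmddhh (possibly with a .grb extension):

ls | grep -vE '[0-9]{10}(\.grb)?$'   # anything listed here should be moved out of the folder first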


Create this file tree on Abel

~/FILESTORE
~/FILESTORE/rawfiles
~/FILESTORE/GRIBHRS
~/FILESTORE/GRIBHRS/Constants_Initials
~/FILESTORE/GRIBHRS/QQ
~/FILESTORE/GRIBHRS/T2D
~/FILESTORE/GRIBHRS/UU
~/FILESTORE/GRIBHRS/VV
~/FILESTORE/GRIBHRS/SW
~/FILESTORE/GRIBHRS/LW
~/FILESTORE/GRIBHRS/Pa
~/FILESTORE/GRIBHRS/Precip
~/FILESTORE/GRIBHRS/DewpT
~/FILESTORE/GRIBHRS/Relhum
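
The whole tree can be created in one go with brace expansion; this sketch assumes exactly the directory names listed above:

mkdir -p ~/FILESTORE/rawfiles ~/FILESTORE/GRIBHRS/{Constants_Initials,QQ,T2D,UU,VV,SW,LW,Pa,Precip,DewpT,Relhum}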

NOTE: it is useful to keep an empty version of this file tree, so that a fresh one can easily be generated for every new dataset. For instance, after making all subfolders of GRIBHRS, copy it like so:

cp -r GRIBHRS/ GRIBHRSempty/  

Whenever new data are to be decomposed, run TdPa2qq.sh and extract_var_after_split.sh as normal (the new data are then saved in GRIBHRS). Then give GRIBHRS a more specific name, for instance

cp -r GRIBHRS/ GRIBHRS_1992/  

(Alternatively, set the correct file paths in TdPa2qq.sh and extract_var_after_split.sh for each run.)


5) Order data

If you want to download Era-Interim data on your own, see this page.

After having received the data, have a look at the file and make sure that all variables are present. Also check that all pressure levels are present (there should be 37 pressure levels, starting with 1, 2, 3, 5, 7, 10, 20 hPa ... and ending with 950, 975, 1000 hPa).

grib_ls *.mars  ## or g1print.exe *.mars
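
If ecCodes/GRIB-API is installed, grib_ls can also print selected keys only, which makes the variable and level check easier to read; the keys below are standard GRIB keys:

grib_ls -p shortName,typeOfLevel,level *.mars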

If you order .mars files (GRIB files) from Bjørg, she'll give you a file path to where they are saved, for instance

/mn/vann/metos-d3/bjoergr/mars/irene/jan_1992.tgz

To access these data, log on to sverdrup, ice, rossby, jern or any other remote machine except Abel; you won't have access to the files from Abel or your local network. (Note: you may also have to prepend /net/vann to the file path.) Finally, transfer the file to your Abel account using scp, and remove the temporary copy.

ssh -YC irenebn@sverdrup
cd <filepath to where I want to store the .tgz file temporarily>
cp /net/vann/mn/vann/metos-d3/bjoergr/mars/irene/jan_1992.tgz . 
scp ./jan_1992.tgz irenebn@abel.uio.no:~/<filepath>
rm jan_1992.tgz

Then open a new terminal, log on to Abel and unpack the data in an appropriate folder. It is advisable to put all new data into a folder called "rawfiles", because the scripts expect to find the data there. If you already have old data stored in rawfiles, rename that folder to something more specific first:

mv rawfiles rawfiles_1979-test
mkdir rawfiles
scp ./jan_1992.tgz irenebn@abel.uio.no:~/FILESTORE/rawfiles

ssh -YC irenebn@abel
cd <filepath to where I want to store the data>
tar xfz jan_1992.tgz
rm jan_1992.tgz


6) Split GRIB files into subdaily time steps

split_time.sh takes a GRIB file as input and outputs one file per time step. Your GRIB files are probably saved under FILESTORE/rawfiles<years>, while the scripts should be stored under FILESTORE. To run the script, type

source ../split_time.sh
#!/bin/bash
# Script extracting the values at 4 time steps per day (6-hourly) from a set of GRIB files (.mars). 
# Save this script in the same directory as the files.
# Run the script by typing source split_time.sh in the terminal.
##########################################################
## First, extract LW, SW and precip from the mars files.
## This must be done before splithour
##########################################################
#_______________Write paths here__________________________
fildir='/usit/abel/u1/irenebn/FILESTORE/TestInvertlatTilSlutt/rawfiles1992'  
savedir='/usit/abel/u1/irenebn/FILESTORE/TestInvertlatTilSlutt/GRIBHRS'
startfile='/usit/abel/u1/irenebn/FILESTORE/TestInvertlatTilSlutt/rawfiles1992/*010100.grb' 
extractdir='/usit/abel/u1/irenebn/hrldas/hrldas-v3.4.1/Utility_programs' 
#
## LW, SW and PCP are available at two hours each day: 06 and 18.
for file in $fildir/*.mars; do
  $extractdir/gribextract -c 175 -l 1,0,0 $file $fildir/tempLW_ ##${file:69:10}
  cdo splithour $fildir/tempLW_ $fildir/tempLW_hour
  cdo invertlat $fildir/tempLW_hour06.grb $savedir/tempLW_invert06.grb
  cdo invertlat $fildir/tempLW_hour18.grb $savedir/tempLW_invert18.grb
  cdo divc,10800.0 $savedir/tempLW_invert06.grb $savedir/LW/LW_${file:69:8}$'06'
  cdo divc,10800.0 $savedir/tempLW_invert18.grb $savedir/LW/LW_${file:69:8}$'18'
  #
  $extractdir/gribextract -c 169 -l 1,0,0 $file $fildir/tempSW_ ##${file:69:10} #${file:69:10} 
  cdo splithour $fildir/tempSW_ $fildir/tempSW_hour
  cdo invertlat $fildir/tempSW_hour06.grb $savedir/tempSW_invert06.grb
  cdo invertlat $fildir/tempSW_hour18.grb $savedir/tempSW_invert18.grb
  cdo divc,10800.0 $savedir/tempSW_invert06.grb $savedir/SW/SW_${file:69:8}$'06'
  cdo divc,10800.0 $savedir/tempSW_invert18.grb $savedir/SW/SW_${file:69:8}$'18'
  #
  $extractdir/gribextract -c 228 -l 1,0,0 $file $fildir/tempPCP_ ##${file:69:10} # 228: accumulated precip in m; want kg/m^2 (mm) per second, i.e. multiply by 1000 and divide by 3*60*60 s
  cdo splithour $fildir/tempPCP_ $fildir/tempPCP_hour
  cdo invertlat $fildir/tempPCP_hour06.grb $savedir/tempPCP_invert06.grb
  cdo invertlat $fildir/tempPCP_hour18.grb $savedir/tempPCP_invert18.grb
  cdo divc,10800.0 $savedir/tempPCP_invert06.grb $savedir/PCP/PCP_${file:69:8}$'06'
  cdo divc,10800.0 $savedir/tempPCP_invert18.grb $savedir/PCP/PCP_${file:69:8}$'18'
  #
  rm $savedir/temp*  # clean up unnecessary files
  rm $fildir/temp*
done
#
##########################################################
## Then, split the files into whichever timestep is wanted.
## (splithour is preferred in my case)
##########################################################
for file in *.mars; do
   mv $file ${file%.mars}   # strip the .mars extension (use cp instead of mv if you need to keep the raw data)
done
#
for file in ma*; do
  echo $file
  cdo splithour $file $file
  rm $file     
done
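
After the script has finished, a quick file count per output folder catches missed steps; $savedir is the path defined at the top of the script. LW, SW and PCP should have two files per processed day (hours 06 and 18):

ls $savedir/LW | wc -l    # expect 2 files per processed day
ls $savedir/SW | wc -l
ls $savedir/PCP | wc -l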


7) Calculate specific humidity

To run the script, type

source ../TdPa2qq.sh

The GRIB files do not contain the needed near surface atmospheric mixing ratio (see page 12 in the HRLDAS user's guide).

Near surface atmospheric mixing ratio is approximately equal to specific humidity (according to this and this).

The ECMWF FAQ describes how to do the calculation (see point 43).

Please note that the guide explains how to calculate the saturation specific humidity (by inputting the air temperature, T2D). We are instead interested in the actual specific humidity, so we input the dew point temperature instead of the air temperature: the dew point temperature reflects the water vapour actually present, whereas the air temperature determines how much the air parcel could potentially hold.
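
Written out, the calculation implemented by the scripts below is (constants from IFS equations 7.4 and 7.5; the over-water values are shown, and the second script swaps in the over-ice coefficients):

q = eps * e_sat(Td) / ( p_s - (1 - eps) * e_sat(Td) ),  with eps = 0.62188
e_sat(Td) = 611.21 * exp( 17.502 * (Td - 273.16) / (Td - 32.19) )   [Pa, over water]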

This first script shows how to calculate specific humidity from the first set of parameters in the guide (the over-water values).

#!/bin/bash
# Convert Era-Interim surface dew point temperature (168) to specific humidity using 
# surface pressure (134) and the formulas in the EC documentation: see equations 
# 7.4 and 7.5 (pages 91-92) in part IV, http://old.ecmwf.int/research/ifsdocs/CY36r1/PHYSICS/IFSPart4.pdf
# Td and p were downloaded from EC in the same file; cdo showname gives var134 (pressure) and var168 (Td)
savedir='/usit/abel/u1/irenebn/FILESTORE/GRIBHRS'
#
for file in ma*; do  
   echo $file
   cdo -expr,'qsat=0.62188*(611.21*exp(17.502*(var168-273.16)/(var168-32.19)))/(var134-(1-0.62188)*(611.21*exp(17.502*((var168-273.16)/(var168-32.19)))));' $file maQQ2m
   cdo invertlat    maQQ2m     maQQinvert 
   cdo chcode,1,133 maQQinvert maQQsurf
   cdo chparam,133,133.128 maQQsurf $savedir/QQ/QQ_${file:2:10}  
done 

The next script shows how to use both sets of parameters, conditioned on the surface temperature, var167. A different set of parameters is used over ice, i.e. when var167 is less than 273.16 K.

#!/bin/bash
# Convert Era-Interim surface dew point temperature (168) to specific humidity using surface pressure (134) and the formulas in the EC documentation: see
# equations 7.4 and 7.5 (pages 91-92) in part IV, http://www.ecmwf.int/research/ifsdocs/CY36r1/PHYSICS/IFSPart4.pdf
# Td and p were downloaded from EC in the same file
savedir='/usit/abel/u1/irenebn/FILESTORE/1992and1993/GRIBHRS'
fildir='/usit/abel/u1/irenebn/FILESTORE/1992and1993'
for file in $fildir/ma199*.mars; do
 echo $file
 grib_filter $fildir/split-needMARSfile_134_167_168 $file # the input must be a .mars file, not the split GRIB files; this produces a filter.out containing all three variables
 cdo -gec,273.16 -selname,var167 filter.out aboveZero_${file:46:6}.grb  # mask: 1 where T2 >= 273.16 K (0 C), 0 elsewhere
 cdo -ltc,273.16 -selname,var167 filter.out belowZero_${file:46:6}.grb  # mask: 1 where T2 <  273.16 K, 0 elsewhere
 mv filter.out filter_${file:46:6}.out
done
for file in $fildir/ma199*.grb; do  # use the .grb files to get the dates, not just the month
   echo $file
   cdo splitday filter_${file:46:6}.out maQQdata_${file:46:6} # days (not hours yet)
   cdo -expr,'qsat=0.62188*(611.21*exp(17.502*(var168-273.16)/(var168-32.19)))/(var134-(1-0.62188)*(611.21*exp(17.502*((var168-273.16)/(var168-32.19)))));'            maQQdata_${file:46:8}.grb maQQ2m_aboveZero_${file:46:8} 
   cdo -expr,'qsat=0.62188*(611.21*exp(22.587*(var168-273.16)/(var168-(-20.7))))/(var134-(1-0.62188)*(611.21*exp(22.587*((var168-273.16)/(var168-(-20.7))))));'        maQQdata_${file:46:8}.grb maQQ2m_belowZero_${file:46:8}
# To select all field elements of ifile2 if the corresponding field element of ifile1 is greater than 0 and from ifile3 otherwise use:  
# cdo ifthenelse ifile1 ifile2 ifile3 ofile
   cdo ifthenelse aboveZero_${file:46:6}.grb maQQ2m_aboveZero_${file:46:8} maQQ2m_belowZero_${file:46:8} maQQ2m_${file:46:8}
   cdo invertlat $fildir/maQQ2m_${file:46:8}  $fildir/maQQinvert_${file:46:8} 
   cdo chcode,1,133 $fildir/maQQinvert_${file:46:8} $fildir/maQQsurf_${file:46:8}
   cdo chparam,133,133.128 $fildir/maQQsurf_${file:46:8} $fildir/maQQsCode_${file:46:8}
   cdo splithour $fildir/maQQsCode_${file:46:8} $savedir/QQ/QQ_${file:46:8}
   rm maQQ*
done
#  rm filter*.out
#  rm above*.grb
#  rm below*.grb
#for file in QQ_*.grb; do mv $file ${file%.grb}; done # For some reason, the files are called QQ_*.grb. Run this line in the terminal to strip the .grb suffix.

Explanation: A mars file in the format yyyymm.mars is filtered to extract var134 (pressure), var167 (air temperature) and var168 (dew point temperature). These fields are stored in filter.out.

cdo -gec,273.16 ...

This line produces a mask that is 1 where the 2 m temperature is at or above 273.16 K (0 degrees Celsius) and 0 elsewhere. Then filter.out is renamed to filter_yyyymm.out.

In the second loop, filter_yyyymm.out is split into days, in files called maQQdata_yyyymmdd. These daily files are used to calculate q_sat in two ways; maQQ2m_aboveZero_yyyymmdd contains the resulting values using the first set of parameters (over liquid water); maQQ2m_belowZero_yyyymmdd contains the resulting values using the second set of parameters (over ice).

cdo ifthenelse    aboveZero_yyyymm    maQQ2m_aboveZero_yyyymmdd    maQQ2m_belowZero_yyyymmdd

In this line, the mask aboveZero_yyyymm determines whether the result from maQQ2m_aboveZero or maQQ2m_belowZero is stored. (Note: it is strange that the monthly aboveZero_yyyymm can be combined with the daily files in a sensible way; this should be checked.)
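
One way to check this concern is to compare the number of time steps in the mask and in a daily file; cdo ntime prints the number of time steps in a file (the file names here follow the yyyymm/yyyymmdd patterns above):

cdo ntime aboveZero_199201.grb   # time steps in the monthly mask
cdo ntime maQQ2m_19920101        # time steps in one daily file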

Finally, invertlat, chcode, chparam and splithour prepare the files to be stored in GRIBHRS/QQ.


8) Split GRIB files into individual variables

To run the script, type

source ../extract_var_after_split.sh

extract_var_after_split.sh takes a GRIB file as input and outputs one file per variable. A list of variables in GRIB files is found here.

#!/bin/bash
fildir='/usit/abel/u1/irenebn/FILESTORE/rawfiles025IBN'
savedir='/usit/abel/u1/irenebn/FILESTORE/testGRIBHRS'
startfile='/usit/abel/u1/irenebn/FILESTORE/testGRIBHRS/*010100' 
extractdir='/usit/abel/u1/irenebn/hrldas/hrldas-v3.4.1/Utility_programs' 
#____________________________CONSTANTS & INITIALS____________________________
$extractdir/gribextract -c 129 -l 1,0,0 $startfile $savedir/Constants_Initials/temp_SFC_ELEVATION 
# Geopotential is in m^2/s^2; divide by g = 9.80665 to get metres, see the EC FAQ
# cdo divc,9.80665 $savedir/Constants_Initials/temp_SFC_ELEVATION $savedir/Constants_Initials/SFC_ELEVATION
$extractdir/gribextract -c 172 -l 1,0,0 $startfile $savedir/Constants_Initials/LANDSEA 
# May need to convert the ECMWF LANDSEA mask from a fraction to a flag; see rrpf.F or similar in WPS/ungrib
$extractdir/gribextract -c 141 -l 1,0,0 $startfile $savedir/Constants_Initials/WEASD 
# Snow water equivalent (in m, should be kg/m^2, i.e. multiply by 1000):
# SWE (kg/m2) = snow depth (m) x snow density (kg/m3); SWE (m) = snow depth (m) x snow density (kg/m3) / water density (kg/m3)
# cdo mulc,1000.0 $savedir/Constants_Initials/WEASD $savedir/Constants_Initials/WEASD
##echo $startfile $savedir/Constants_Initials/SOIL_M_000-010
$extractdir/gribextract -c 39 -l 112,0,7 $startfile $savedir/Constants_Initials/SOIL_M_000-010
$extractdir/gribextract -c 40 -l 112,7,28 $startfile $savedir/Constants_Initials/SOIL_M_010-040
$extractdir/gribextract -c 41 -l 112,28,100 $startfile $savedir/Constants_Initials/SOIL_M_040-100
$extractdir/gribextract -c 42 -l 112,100,255 $startfile $savedir/Constants_Initials/SOIL_M_100-200
$extractdir/gribextract -c 139 -l 112,0,7 $startfile $savedir/Constants_Initials/SOIL_T_000-010
$extractdir/gribextract -c 170 -l 112,7,28 $startfile $savedir/Constants_Initials/SOIL_T_010-040
$extractdir/gribextract -c 183 -l 112,28,100 $startfile $savedir/Constants_Initials/SOIL_T_040-100
$extractdir/gribextract -c 236 -l 112,100,255 $startfile $savedir/Constants_Initials/SOIL_T_100-200
$extractdir/gribextract -c 235 -l 1,0,0 $startfile $savedir/Constants_Initials/SKINTEMP
# CANWAT starts out as zeros: create a zero field from WEASD and relabel it to code 44
cdo mulc,0.0 $savedir/Constants_Initials/WEASD $savedir/Constants_Initials/CAN
cdo chcode,141,44 $savedir/Constants_Initials/CAN $savedir/Constants_Initials/CANW
cdo chparam,44,44.128 $savedir/Constants_Initials/CANW $savedir/Constants_Initials/CANWAT
  #
  #________________________________VARIABLES________________________________
  # NOTE: the original listing was missing the loop header and a splithour step;
  # both are reconstructed here by analogy with split_time.sh, and the inverted
  # fields are now the ones actually written to the output folders.
for file in $fildir/ma*; do
  $extractdir/gribextract -c 165 -l 1,0,0 $file $fildir/tempUU_ ## ${file:69:10}
  cdo splithour $fildir/tempUU_ $fildir/tempUU_    # produces tempUU_00.grb, tempUU_06.grb, ...
  cdo invertlat $fildir/tempUU_00.grb $savedir/tempUU_invert00.grb
  cdo invertlat $fildir/tempUU_06.grb $savedir/tempUU_invert06.grb
  cdo invertlat $fildir/tempUU_12.grb $savedir/tempUU_invert12.grb
  cdo invertlat $fildir/tempUU_18.grb $savedir/tempUU_invert18.grb
  cdo mulc,1.0 $savedir/tempUU_invert00.grb $savedir/UU/UU_${file:69:8}$'00'
  cdo mulc,1.0 $savedir/tempUU_invert06.grb $savedir/UU/UU_${file:69:8}$'06'
  cdo mulc,1.0 $savedir/tempUU_invert12.grb $savedir/UU/UU_${file:69:8}$'12'
  cdo mulc,1.0 $savedir/tempUU_invert18.grb $savedir/UU/UU_${file:69:8}$'18'
  #
  $extractdir/gribextract -c 166 -l 1,0,0 $file $fildir/tempVV_ ##${file:69:10}
  cdo splithour $fildir/tempVV_ $fildir/tempVV_
  cdo invertlat $fildir/tempVV_00.grb $savedir/tempVV_invert00.grb
  cdo invertlat $fildir/tempVV_06.grb $savedir/tempVV_invert06.grb
  cdo invertlat $fildir/tempVV_12.grb $savedir/tempVV_invert12.grb
  cdo invertlat $fildir/tempVV_18.grb $savedir/tempVV_invert18.grb
  cdo mulc,1.0 $savedir/tempVV_invert00.grb $savedir/VV/VV_${file:69:8}$'00'
  cdo mulc,1.0 $savedir/tempVV_invert06.grb $savedir/VV/VV_${file:69:8}$'06'
  cdo mulc,1.0 $savedir/tempVV_invert12.grb $savedir/VV/VV_${file:69:8}$'12'
  cdo mulc,1.0 $savedir/tempVV_invert18.grb $savedir/VV/VV_${file:69:8}$'18'
  #
  $extractdir/gribextract -c 167 -l 1,0,0 $file $fildir/tempT2_ ##${file:69:10}
  cdo splithour $fildir/tempT2_ $fildir/tempT2_
  cdo invertlat $fildir/tempT2_00.grb $savedir/tempT2_invert00.grb
  cdo invertlat $fildir/tempT2_06.grb $savedir/tempT2_invert06.grb
  cdo invertlat $fildir/tempT2_12.grb $savedir/tempT2_invert12.grb
  cdo invertlat $fildir/tempT2_18.grb $savedir/tempT2_invert18.grb
  cdo mulc,1.0 $savedir/tempT2_invert00.grb $savedir/T2D/T2_${file:69:8}$'00'
  cdo mulc,1.0 $savedir/tempT2_invert06.grb $savedir/T2D/T2_${file:69:8}$'06'
  cdo mulc,1.0 $savedir/tempT2_invert12.grb $savedir/T2D/T2_${file:69:8}$'12'
  cdo mulc,1.0 $savedir/tempT2_invert18.grb $savedir/T2D/T2_${file:69:8}$'18'
  #
  $extractdir/gribextract -c 134 -l 1,0,0 $file $fildir/tempPa_ ##${file:69:10}
  cdo splithour $fildir/tempPa_ $fildir/tempPa_
  cdo invertlat $fildir/tempPa_00.grb $savedir/tempPa_invert00.grb
  cdo invertlat $fildir/tempPa_06.grb $savedir/tempPa_invert06.grb
  cdo invertlat $fildir/tempPa_12.grb $savedir/tempPa_invert12.grb
  cdo invertlat $fildir/tempPa_18.grb $savedir/tempPa_invert18.grb
  cdo mulc,1.0 $savedir/tempPa_invert00.grb $savedir/Pa/Pa_${file:69:8}$'00'
  cdo mulc,1.0 $savedir/tempPa_invert06.grb $savedir/Pa/Pa_${file:69:8}$'06'
  cdo mulc,1.0 $savedir/tempPa_invert12.grb $savedir/Pa/Pa_${file:69:8}$'12'
  cdo mulc,1.0 $savedir/tempPa_invert18.grb $savedir/Pa/Pa_${file:69:8}$'18'
  #
  rm $savedir/temp*   # clean up temporary files
  rm $fildir/temp*
done


A similar way of doing it is given in split_time.sh, but I am not using it because I need to calculate QQ in between.

#!/bin/bash
cdo=cdo   # path to the CDO binary (this variable was undefined in the original listing)
for file in 1979*; do
   echo $file
   $cdo selcode,167 $file T2_$file
   $cdo selcode,165 $file UU_$file
   $cdo selcode,166 $file VV_$file
   $cdo selcode,39 $file SM_000_007_$file
   $cdo selcode,40 $file SM_007_028_$file
   $cdo selcode,41 $file SM_028_100_$file
   $cdo selcode,42 $file SM_100_250_$file
   $cdo selcode,139 $file ST_000_007_$file
   $cdo selcode,170 $file ST_007_028_$file
   $cdo selcode,183 $file ST_028_100_$file
   $cdo selcode,236 $file ST_100_250_$file
   $cdo selcode,31 $file SIce_$file
   $cdo selcode,172 $file LandS_$file
   $cdo selcode,129 $file tempZ_$file
   # Geopotential is in m^2/s^2; divide by g = 9.80665 to get metres, see the EC FAQ
   $cdo divc,9.80665 tempZ_$file Z_$file
   $cdo selcode,235 $file SkinT_$file 
   $cdo selcode,228 $file tempTP_$file
   $cdo divc,10800.0 tempTP_$file TP_$file
   $cdo selcode,169 $file tempSW_$file
   $cdo divc,10800.0 tempSW_$file SW_$file      # average over the previous 3 hours: 1/(3*60*60)
   $cdo selcode,175 $file tempLW_$file
   $cdo divc,10800.0 tempLW_$file LW_$file
   $cdo selcode,141 $file tempSD_$file
   # Snow water equivalent (in m, should be kg/m^2):
   # SWE (kg/m2) = snow depth (m) x snow density (kg/m3); SWE (m) = snow depth (m) x snow density (kg/m3) / water density (kg/m3)
   $cdo mulc,1000.0 tempSD_$file SD_$file 
   $cdo selcode,168 $file Dew_$file
done


Invert latitude

The Era-Interim files and the GFS data use different coordinate conventions. In the Era-Interim files, the origin (i=0, j=0) is in the upper left corner, whereas in GFS the origin is in the lower left corner. This is fixed (in TdPa2QQ.sh and extract_var_after_split.sh) by writing

   $cdo invertlat $file ${file:(-10)}   # :(-10)} removes ma from filename

Please note that if the GRIB files contain several grids (atmospheric as well as ground), this inversion must be done after the files have been split into variables. We are interested in the ground grid (but as long as no atmospheric fields are contained in the raw data, invertlat can be done in split_time.sh).
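
To check which way the latitudes run in a given file, inspect the grid description; a negative yinc (or yfirst = 90) means the file is ordered north to south and still needs inverting:

cdo griddes <file> | grep -E 'yfirst|yinc'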


9) Create WPS files

From the HRLDAS user's guide: "The strategy is to set up model grids and model input files as for a WRF simulation, but then use these files to run HRLDAS instead of WRF."

Consider creating a new folder within WPS/ for your runs. Copy the namelist.wps into this folder and make sure that the GEOGRID.TBL and METGRID.TBL files are available by adding these lines to the namelist.wps:

&GEOGRID
opt_geogrid_tbl_path = '~/WRF/WPSv3.6/WPS/geogrid/',
&METGRID
opt_metgrid_tbl_path = '~/WRF/WPSv3.6/WPS/metgrid/',


Example namelists, Vtables and .TBL files are found here.

In the folder /WRF/WPS, prepare namelist.wps according to your domain. The Java program WRFPortal may help with finding coordinates.

&share
wrf_core = 'ARW',
max_dom = 1,
start_date = '2011-04-01_12:00:00','2007-05-12_00:00:00',
end_date   = '2011-04-01_12:00:00','2007-05-13_00:00:00'
interval_seconds = 21600 !// 6 hours
io_form_geogrid = 2,
/
&geogrid
parent_id         =    1,   1,  
parent_grid_ratio =    1,   5, 
i_parent_start    =    1,   33, 
j_parent_start    =    1,   55,
e_we              =   77,   156,
e_sn              =  160,   261,
geog_data_res     = '10m'        !// 'modis_lakes+10m','modis_lakes+2m', 
dx = 9000,
dy = 9000,
map_proj = 'lambert',
ref_lat   =  54.3,  !??
ref_lon   =  10.5,  !??
truelat1  =  54.3,
truelat2  =  54.3,
stand_lon =  10.5,
geog_data_path = '/projects/metos/annefou/wrf/geog' 
!// '/usit/abel/u1/helenbe/WRF_GEOG/geog',
!// ref_x = 38.5, !??
!// ref_y = 80.,  !??
/
&ungrib
out_format = 'WPS',
prefix = 'SURF', ! or 'ATM' !// ATM is used with Vtable.ECATM, and vice versa.
/
&metgrid
fg_name = 'ATM','SURF'
! constants_name  = './TAVGSFC'
io_form_metgrid = 2, 
/


GEO_EM_d01.nc (WRF/WPS/geogrid.exe)

Run geogrid.exe as described at the Running WRF site.

./geogrid.exe >& geogrid.log 

This generates a geo_em_d01.nc file. You may check it by typing

ncdump -h geo_em_d01.nc

To change the geo_em file, change namelist.wps and run ./geogrid.exe again.

If geogrid.exe is not visible in the folder /WRF/WPS, WPS must be compiled first.

Details can be found in the WRF user's guide.


SURF:1992-01-01_06 (WRF/WPS/ungrib.exe)

Run ungrib.exe as described at the Running WRF site.

Remember to create links to the gribfiles and the Vtables.
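
Since the &ungrib record only takes one prefix at a time, ungrib.exe is typically run twice, relinking the matching Vtable in between. A sketch, where link_grib.csh ships with WPS and Vtable.ECSURF is a hypothetical name for the surface Vtable (the namelist comment above pairs 'ATM' with Vtable.ECATM):

./link_grib.csh <path to gribfiles>/*
ln -sf Vtable.ECSURF Vtable        # hypothetical surface Vtable; use your own
./ungrib.exe                       # with prefix = 'SURF' in namelist.wps
ln -sf Vtable.ECATM Vtable         # then set prefix = 'ATM' and rerun
./ungrib.exe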

After having created the intermediate files, make sure that all variables are present.

util/rd_intermediate.exe SURF\:*

If not, the Vtable must be changed accordingly.

MET_EM_d01.nc (WRF/WPS/metgrid.exe)

Run metgrid.exe as described at the Running WRF site.

Remember to create links to the Metgrid.tbl table.

After having created the NetCDF files met_em*, make sure that all variables are present.

ncdump -h met_em*

If not, the Vtable must be changed accordingly.


WRFINPUT_d01.nc (WRF/WRFV3/run/real.exe)

Then navigate to the folder WRF/WRFV3/run and run real.exe as described at the Running WRF site.

If you already have a wrfinput_d01... file, you may check it by typing

ncdump -h wrfinput_d01...

Details can be found in the WRF user's guide.

After having created the NetCDF files wrfinput, make sure that all variables are present.

ncdump -h wrfinput*


11) Run HRLDAS pre-processing

Next, run consolidate_grib.exe to consolidate the input files.

Consider creating a new folder within hrldas/hrldas-v3.4.1/HRLDAS_COLLECT_DATA. Copy the namelist.input into this folder. Also, change the GRIB1_PARAMETER_TABLE so that it is suitable for Era-Interim data: use version 128 from this page: rda.ucar.edu/docs/formats/grib/gribdoc/ecmwf_params.html

export GRIB_ROOT=~/hrldas/hrldas-v3.4.1/HRLDAS_COLLECT_DATA/GRIB_TABLES/
./consolidate_grib.exe

After having created the NetCDF files <yyyymmddhh>LDASIN, make sure that all variables are present.

ncdump -h <yyyy>*LDASIN*

12) Run HRLDAS

Consider creating a new folder within hrldas/hrldas-v3.4.1/Run. Copy the namelist.hrldas and all tables *.TBL into this folder.

HRLDAS is run with the command

./Noah_hrldas_beta 

It requires namelist.hrldas, input files in the format <YYYYMMDDHH>.LDASIN_DOMAIN<>, various .TBL files and wrfinput_d01.
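
The run is controlled by namelist.hrldas. As a minimal sketch, the entries below are the typical ones to check; the names follow the HRLDAS user's guide, but verify them against the template shipped with hrldas-v3.4.1:

&NOAHLSM_OFFLINE
 HRLDAS_CONSTANTS_FILE = "wrfinput_d01"  ! from real.exe (step 9)
 INDIR  = "./"                           ! where the <YYYYMMDDHH>.LDASIN_DOMAIN1 files are
 OUTDIR = "./"
 START_YEAR  = 1992
 START_MONTH = 01
 START_DAY   = 01
 START_HOUR  = 00
 KDAY = 31                               ! number of days to run
 FORCING_TIMESTEP = 21600                ! 6-hourly forcing, in seconds
 NOAH_TIMESTEP    = 3600
 OUTPUT_TIMESTEP  = 10800
/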