Also, do print the [https://code.zmaw.de/projects/cdo/embedded/1.6.3/cdo_refcard.pdf CDO reference card].
== Pre-processing Era-Interim data ==

The Era-Interim (ECMWF) data are stored in a different format than the default GFS (NCEP) data. The preparation of the EC data described below is taken care of by the scripts split_time.sh, extract_var_after_split.sh and TdPa2qq.sh. NOTE: before running extract_var_after_split.sh, make sure that ''only'' grib files on the hourly timescale are present in the folder (otherwise, incorrect files called "T2_19930131.g" or "T2_199301.mar" will be produced instead of "T2_1993013100").
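The warning above can be automated. This is a minimal sketch, assuming the hourly split files follow the ma<yyyymmddhh> naming used later on this page, while stray files keep a .mars or monthly suffix; the function name check_hourly_only is just for illustration:

```shell
#!/bin/bash
# check_hourly_only DIR: return non-zero if any ma* file in DIR does not
# end in a 10-digit yyyymmddhh timestamp (i.e. is not an hourly split file).
check_hourly_only() {
  local f bad=0
  for f in "$1"/ma*; do
    [ -e "$f" ] || continue            # no ma* files at all is fine
    if [[ ! $f =~ [0-9]{10}$ ]]; then
      echo "not an hourly file: $f" >&2
      bad=1
    fi
  done
  return $bad
}
```

Run check_hourly_only on the data folder before sourcing extract_var_after_split.sh.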
=== Create this file tree on Abel ===

 ~/FILESTORE
 ~/FILESTORE/rawfiles
 ~/FILESTORE/GRIBHRS
 ~/FILESTORE/GRIBHRS/Constants_Initials
 ~/FILESTORE/GRIBHRS/QQ
 ~/FILESTORE/GRIBHRS/T2D
 ~/FILESTORE/GRIBHRS/UU
 ~/FILESTORE/GRIBHRS/VV
 ~/FILESTORE/GRIBHRS/SW
 ~/FILESTORE/GRIBHRS/LW
 ~/FILESTORE/GRIBHRS/Pa
 ~/FILESTORE/GRIBHRS/Precip
 ~/FILESTORE/GRIBHRS/DewpT
 ~/FILESTORE/GRIBHRS/Relhum

NOTE: it is useful to keep an empty version of this file tree, to make it easy to generate a new one for every new dataset. For instance, after making all subfolders of GRIBHRS, copy it like so:

 cp -r GRIBHRS/ GRIBHRSempty/

Whenever new data is to be decomposed, run TdPa2qq.sh and extract_var_after_split.sh as normal (the new data is then saved in GRIBHRS). Then give GRIBHRS a more specific name, for instance

 cp -r GRIBHRS/ GRIBHRS_1992/

(Alternatively, set the correct file paths in TdPa2qq.sh and extract_var_after_split.sh for each run.)
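The file tree above can be created in one go with bash brace expansion. A minimal sketch (the root defaults to ~/FILESTORE; set FILESTORE_ROOT to use another location — that variable name is just for this sketch):

```shell
#!/bin/bash
# Create the HRLDAS file tree described above under $ROOT.
ROOT="${FILESTORE_ROOT:-$HOME/FILESTORE}"

# rawfiles receives the incoming .mars data; GRIBHRS receives the split output.
mkdir -p "$ROOT"/rawfiles \
         "$ROOT"/GRIBHRS/{Constants_Initials,QQ,T2D,UU,VV,SW,LW,Pa,Precip,DewpT,Relhum}

# Keep an empty template of the tree for later datasets, as suggested above.
cp -r "$ROOT"/GRIBHRS "$ROOT"/GRIBHRSempty
```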
 /mn/vann/metos-d3/bjoergr/mars/irene/jan_1992.tgz

To access these data, log on to sverdrup or any other remote machine except Abel. You won't have access to the files from Abel or your local computer. (Note: you may also have to include /net/vann at the start of the file path.)
 ssh -YC irenebn@sverdrup
 cd <filepath to where I want to store the .tgz file temporarily>
 cp /net/vann/mn/vann/metos-d3/bjoergr/mars/irene/jan_1992.tgz .
 scp ./jan_1992.tgz /mn/vann/metos-d2/irenebn/HRLDAS/data
 rm jan_1992.tgz
Then open a new terminal, go to the data folder and unzip the data. It is advisable to put all new data into a folder called "rawfiles", because the data can then be accessed by the scripts. If you already have old data stored in rawfiles, give that folder a more specific name first. That is:

 mv rawfiles rawfiles_1979-test
 mkdir rawfiles
 cp ./jan_1992.tgz rawfiles/
 cd <filepath to where I want to store the data>
 tar xfz jan_1992.tgz
 rm jan_1992.tgz
Not until after preprocessing should you transfer the files to your Abel account (using scp) and remove the original file. Preprocessing is described on this page: [https://wiki.uio.no/mn/geo/geoit/index.php?title=Running_NOAH https://wiki.uio.no/mn/geo/geoit/index.php?title=Running_NOAH]
== 6) Split GRIB files into subdaily time step ==

split_time.sh takes a GRIB file as input and outputs one file per time step. Your GRIB files are possibly saved under FILESTORE/rawdata<years>, while the scripts should be stored under FILESTORE. To run the script, type

 source ../split_time.sh

 #!/bin/bash
 # Script extracting subdaily values from a set of grib files (.mars).
 # Save this script in the same directory as the files.
 # Run the script by typing split_time.sh in the terminal.
 ##########################################################
 ## First, extract LW, SW and precip from the mars files.
 ## This must be done before splithour.
 ##########################################################
 #_______________Write paths here__________________________
 fildir='/usit/abel/u1/irenebn/FILESTORE/TestInvertlatTilSlutt/rawfiles1992'
 savedir='/usit/abel/u1/irenebn/FILESTORE/TestInvertlatTilSlutt/GRBHRS'
 startfile='/usit/abel/u1/irenebn/FILESTORE/TestInvertlatTilSlutt/rawfiles1992/*010100.grb'
 extractdir='/usit/abel/u1/irenebn/hrldas/hrldas-v3.4.1/Utility_programs'
 #
 ## LW, SW and PCP are available for two hours each day: 06 and 18.
 for file in $fildir/*.mars; do
   $extractdir/gribextract -c 175 -l 1,0,0 $file $fildir/tempLW_
   cdo splithour $fildir/tempLW_ $fildir/tempLW_hour
   cdo invertlat $fildir/tempLW_hour06.grb $savedir/tempLW_invert06.grb
   cdo invertlat $fildir/tempLW_hour18.grb $savedir/tempLW_invert18.grb
   cdo divc,10800.0 $savedir/tempLW_invert06.grb $savedir/LW/LW_${file:69:8}$'06'
   cdo divc,10800.0 $savedir/tempLW_invert18.grb $savedir/LW/LW_${file:69:8}$'18'
   #
   $extractdir/gribextract -c 169 -l 1,0,0 $file $fildir/tempSW_
   cdo splithour $fildir/tempSW_ $fildir/tempSW_hour
   cdo invertlat $fildir/tempSW_hour06.grb $savedir/tempSW_invert06.grb
   cdo invertlat $fildir/tempSW_hour18.grb $savedir/tempSW_invert18.grb
   cdo divc,10800.0 $savedir/tempSW_invert06.grb $savedir/SW/SW_${file:69:8}$'06'
   cdo divc,10800.0 $savedir/tempSW_invert18.grb $savedir/SW/SW_${file:69:8}$'18'
   #
   # 228: accumulated precip in m; we want mm (or kg/m^2, i.e. *1000) per second (i.e. /(3*60*60))
   $extractdir/gribextract -c 228 -l 1,0,0 $file $fildir/tempPCP_
   cdo splithour $fildir/tempPCP_ $fildir/tempPCP_hour
   cdo invertlat $fildir/tempPCP_hour06.grb $savedir/tempPCP_invert06.grb
   cdo invertlat $fildir/tempPCP_hour18.grb $savedir/tempPCP_invert18.grb
   cdo divc,10800.0 $savedir/tempPCP_invert06.grb $savedir/PCP/PCP_${file:69:8}$'06'
   cdo divc,10800.0 $savedir/tempPCP_invert18.grb $savedir/PCP/PCP_${file:69:8}$'18'
   #
   rm $savedir/temp* # clean up temporary files
   rm $fildir/temp*
 done
 #
 ##########################################################
 ## Then, split the files into whichever timestep is wanted.
 ## (splithour is preferred in my case)
 ##########################################################
 for file in *.mars; do
   cp $file ${file%.mars} # copy without the .mars extension
   rm $file               # comment this out if you need to keep the raw data
 done
 #
 for file in ma*; do
   echo $file
   cdo splithour $file $file
   rm $file
 done

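Note the `${file:69:8}` expansions in the script above: they slice the date out of an absolute file path, and the offset 69 is specific to the author's directory layout, so it must be adapted whenever fildir changes. A quick sketch with a shorter, hypothetical path:

```shell
#!/bin/bash
# ${var:offset:length} slices length characters starting at offset (0-based).
file="/data/raw/ma1993013100.mars"   # hypothetical path; here the date starts at offset 12
echo "${file:12:8}"    # yyyymmdd   -> 19930131
echo "${file:12:10}"   # yyyymmddhh -> 1993013100
echo "${file%.mars}"   # strips the extension -> /data/raw/ma1993013100
```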
== 7) Calculate specific humidity ==

To run the script, type

 source ../TdPa2qq.sh

The GRIB files do not contain the needed near-surface atmospheric mixing ratio (see [http://www.ral.ucar.edu/research/land/technology/lsm/HRLDAS_USERS_GUIDE_34.pdf page 12 in the HRLDAS user's guide]).

The near-surface atmospheric mixing ratio is approximately equal to the specific humidity (according to [http://www.geog.ucsb.edu/~joel/g266_s10/lecture_notes/chapt03/oh10_3_01/oh10_3_01.html this] [http://www.google.no/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&ved=0CEAQFjAC&url=http%3A%2F%2Fwww.conservationphysics.org%2Fteabag%2Fah_mr.php&ei=YZlCU__MNYrFtAapuoDoDg&usg=AFQjCNFZyo5GlfTcoBEtpl0XgyuvsR6zxg&bvm=bv.64367178,d.Yms&cad=rja and this]).

[http://www.ecmwf.int/products/data/archive/data_faq.html The ECMWF FAQ] describes how to do the calculation (see [http://www.ecmwf.int/products/data/archive/data_faq.html#relative_humidity point 43]).

Please note that the guide explains how to calculate the saturation specific humidity by inputting the air temperature, T2D. We are instead interested in the specific humidity, so we input the dew point temperature instead of the air temperature: the dew point temperature indicates the '''absolute''' water vapour content, whereas the air temperature states how much water vapour the air parcel can potentially hold.

This first script shows how to calculate specific humidity from the first set of parameters in the [http://old.ecmwf.int/research/ifsdocs/CY36r1/PHYSICS/IFSPart4.pdf guide].

 #!/bin/bash
 # Convert Era-Interim surface dew point temperature (168) to specific humidity using
 # surface pressure (134) and the formulas in the EC documentation: see equations
 # 7.4 and 7.5 (pages 91-92) in part IV, http://old.ecmwf.int/research/ifsdocs/CY36r1/PHYSICS/IFSPart4.pdf
 # Td and p were downloaded from EC in the same file; cdo showname gives var134 (pressure) and var168 (Td).
 savedir='/usit/abel/u1/irenebn/FILESTORE/GRIBHRS'
 #
 for file in ma*; do
   echo $file
   cdo -expr,'qsat=0.62188*(611.21*exp(17.502*(var168-273.16)/(var168-32.19)))/(var134-(1-0.62188)*(611.21*exp(17.502*((var168-273.16)/(var168-32.19)))));' $file maQQ2m
   cdo invertlat maQQ2m maQQinvert
   cdo chcode,1,133 maQQinvert maQQsurf
   cdo chparam,133,133.128 maQQsurf $savedir/QQ/QQ_${file:2:10}
 done

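As a sanity check of the cdo -expr formula, the same expression can be evaluated for a single point, for instance with awk. The constants are taken verbatim from the script above; the function name qsat is just for this sketch:

```shell
#!/bin/bash
# Specific humidity from dew point Td (K) and surface pressure p (Pa),
# using the saturation vapour pressure and specific-humidity expressions
# (Eqs. 7.4/7.5 of the IFS documentation) as written in the cdo -expr call.
qsat() {
  awk -v td="$1" -v p="$2" 'BEGIN {
    e = 611.21 * exp(17.502 * (td - 273.16) / (td - 32.19))   # e_sat(Td) in Pa
    printf "%.6f\n", 0.62188 * e / (p - (1 - 0.62188) * e)    # q in kg/kg
  }'
}

qsat 273.16 101325   # Td = 0 C, p = 1013.25 hPa
```

At Td = 273.16 K the exponential is 1, so q = 0.62188·611.21/(101325 − 0.37812·611.21) ≈ 0.0038 kg/kg, a reasonable near-surface value at the freezing point.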
The next script shows how to use both sets of parameters, conditioned on the 2 m air temperature, var167: a different set of parameters is used over ice, that is, when var167 is below 273.16 K.

 #!/bin/bash
 # Convert Era-Interim surface dew point temperature (168) to specific humidity using surface pressure (134)
 # and the formulas in the EC documentation: see equations 7.4 and 7.5 (pages 91-92) in part IV,
 # http://www.ecmwf.int/research/ifsdocs/CY36r1/PHYSICS/IFSPart4.pdf
 # Td and p were downloaded from EC in the same file.
 savedir='/usit/abel/u1/irenebn/FILESTORE/1992and1993/GRIBHRS'
 fildir='/usit/abel/u1/irenebn/FILESTORE/1992and1993'
 for file in $fildir/ma199*.mars; do
   echo $file
   grib_filter $fildir/split-needMARSfile_134_167_168 $file # this needs a .mars file, not split grib files; it produces a filter.out containing all three variables
   cdo -gec,273.16 -selname,var167 filter.out aboveZero_${file:46:6}.grb # mask: 1 where T2 >= 273.16 K, 0 elsewhere
   cdo -ltc,273.16 -selname,var167 filter.out belowZero_${file:46:6}.grb # mask: 1 where T2 < 273.16 K, 0 elsewhere
   mv filter.out filter_${file:46:6}.out
 done
 for file in $fildir/ma199*.grb; do # use .grb to get the dates, not just the month
   echo $file
   cdo splitday filter_${file:46:6}.out maQQdata_${file:46:6} # days (not hours yet)
   cdo -expr,'qsat=0.62188*(611.21*exp(17.502*(var168-273.16)/(var168-32.19)))/(var134-(1-0.62188)*(611.21*exp(17.502*((var168-273.16)/(var168-32.19)))));' maQQdata_${file:46:8}.grb maQQ2m_aboveZero_${file:46:8}
   cdo -expr,'qsat=0.62188*(611.21*exp(22.587*(var168-273.16)/(var168-(-20.7))))/(var134-(1-0.62188)*(611.21*exp(22.587*((var168-273.16)/(var168-(-20.7))))));' maQQdata_${file:46:8}.grb maQQ2m_belowZero_${file:46:8}
   # To select all field elements of ifile2 if the corresponding field element of ifile1 is greater than 0, and from ifile3 otherwise, use:
   # cdo ifthenelse ifile1 ifile2 ifile3 ofile
   cdo ifthenelse aboveZero_${file:46:6}.grb maQQ2m_aboveZero_${file:46:8} maQQ2m_belowZero_${file:46:8} maQQ2m_${file:46:8}
   cdo invertlat $fildir/maQQ2m_${file:46:8} $fildir/maQQinvert_${file:46:8}
   cdo chcode,1,133 $fildir/maQQinvert_${file:46:8} $fildir/maQQsurf_${file:46:8}
   cdo chparam,133,133.128 $fildir/maQQsurf_${file:46:8} $fildir/maQQsCode_${file:46:8}
   cdo splithour $fildir/maQQsCode_${file:46:8} $savedir/QQ/QQ_${file:46:8}
   rm maQQ*
 done
 # rm filter*.out
 # rm above*.grb
 # rm below*.grb
 # for file in QQ_*; do mv $file ${file%.grb}; done # For some reason, the files are called QQ_*.grb. Run this line in the terminal to remove the .grb suffix.

Explanation: a mars file in the format yyyymm.mars is filtered to extract var134 (pressure), var167 (air temperature) and var168 (dew point temperature). These fields are stored in filter.out.

 cdo -gec,273.16 ...

With this line, all grid points with temperatures below 0 degrees Celsius get the value 0. Then filter.out is renamed to filter_yyyymm.out.

In the second loop, filter_yyyymm.out is split into days, in files called maQQdata_yyyymmdd. These daily files are used to calculate q_sat in two ways: maQQ2m_aboveZero_yyyymmdd contains the resulting values using the first set of parameters (over liquid water); maQQ2m_belowZero_yyyymmdd contains the resulting values using the second set of parameters (over ice).

 cdo ifthenelse aboveZero_yyyymm maQQ2m_aboveZero_yyyymmdd maQQ2m_belowZero_yyyymmdd

In this line, the numerical value of aboveZero_yyyymm determines whether the result from maQQ2m_aboveZero or maQQ2m_belowZero should be stored. (Actually, it is strange that the monthly aboveZero_yyyymm can split the daily files in a sensible way. This should be checked.)

Last, invertlat, chcode, chparam and splithour prepare the files to be stored in GRIBHRS/QQ.

== 8) Split GRIB files into individual variables ==

To run the script, type

 source ../extract_var_after_split.sh

extract_var_after_split.sh takes a GRIB file as input and outputs one file per variable. A list of [http://www.ecmwf.int/publications/manuals/d/gribapi/param/filter=mars/order=paramId/order_type=asc/p=-1/table=all/ variables in GRIB files is found here.]

 #!/bin/bash
 fildir='/usit/abel/u1/irenebn/FILESTORE/rawfiles025IBN'
 savedir='/usit/abel/u1/irenebn/FILESTORE/testGRIBHRS'
 startfile='/usit/abel/u1/irenebn/FILESTORE/testGRIBHRS/*010100'
 extractdir='/usit/abel/u1/irenebn/hrldas/hrldas-v3.4.1/Utility_programs'
 #____________________________CONSTANTS & INITIALS____________________________
 $extractdir/gribextract -c 129 -l 1,0,0 $startfile $savedir/Constants_Initials/temp_SFC_ELEVATION
 # Geopotential is in m^2/s^2; divide by g, 9.80665, see the EC FAQ
 # cdo divc,9.80665 $savedir/Constants_Initials/temp_SFC_ELEVATION $savedir/Constants_Initials/SFC_ELEVATION
 $extractdir/gribextract -c 172 -l 1,0,0 $startfile $savedir/Constants_Initials/LANDSEA
 # May need to convert the ECMWF LANDSEA mask from a fraction to a flag; see rrpr.F or similar in WPS/ungrib
 $extractdir/gribextract -c 141 -l 1,0,0 $startfile $savedir/Constants_Initials/WEASD
 # Snow water equivalent (in m, should be kg/m^2: multiply by 1000, the water density):
 # SWE (kg/m2) = snow depth (m) x snow density (kg/m3)
 # SWE (m) = snow depth (m) x snow density (kg/m3) / water density (kg/m3)
 # cdo mulc,1000.0 $savedir/Constants_Initials/WEASD $savedir/Constants_Initials/WEASD
 $extractdir/gribextract -c 39 -l 112,0,7 $startfile $savedir/Constants_Initials/SOIL_M_000-010
 $extractdir/gribextract -c 40 -l 112,7,28 $startfile $savedir/Constants_Initials/SOIL_M_010-040
 $extractdir/gribextract -c 41 -l 112,28,100 $startfile $savedir/Constants_Initials/SOIL_M_040-100
 $extractdir/gribextract -c 42 -l 112,100,255 $startfile $savedir/Constants_Initials/SOIL_M_100-200
 $extractdir/gribextract -c 139 -l 112,0,7 $startfile $savedir/Constants_Initials/SOIL_T_000-010
 $extractdir/gribextract -c 170 -l 112,7,28 $startfile $savedir/Constants_Initials/SOIL_T_010-040
 $extractdir/gribextract -c 183 -l 112,28,100 $startfile $savedir/Constants_Initials/SOIL_T_040-100
 $extractdir/gribextract -c 236 -l 112,100,255 $startfile $savedir/Constants_Initials/SOIL_T_100-200
 $extractdir/gribextract -c 235 -l 1,0,0 $startfile $savedir/Constants_Initials/SKINTEMP
 # Canopy water: create a field of zeros on the right grid, then set its code to 44
 cdo mulc,0.0 $savedir/Constants_Initials/WEASD $savedir/Constants_Initials/CAN
 cdo chcode,141,44 $savedir/Constants_Initials/CAN $savedir/Constants_Initials/CANW
 cdo chparam,44,44.128 $savedir/Constants_Initials/CANW $savedir/Constants_Initials/CANWAT
 #
 #________________________________VARIABLES________________________________
 for file in $fildir/ma*; do # NB: this loop header was missing in the original script; adjust the pattern to your file names
   $extractdir/gribextract -c 165 -l 1,0,0 $file $savedir/tempUU_
   cdo invertlat $fildir/tempUU_00.grb $savedir/tempUU_invert00.grb
   cdo invertlat $fildir/tempUU_06.grb $savedir/tempUU_invert06.grb
   cdo invertlat $fildir/tempUU_12.grb $savedir/tempUU_invert12.grb
   cdo invertlat $fildir/tempUU_18.grb $savedir/tempUU_invert18.grb
   cdo mulc,1.0 $savedir/tempUU_invert00.grb $savedir/UU/UU_${file:69:10}
   cdo mulc,1.0 $savedir/tempUU_invert06.grb $savedir/UU/UU_${file:69:10}
   cdo mulc,1.0 $savedir/tempUU_invert12.grb $savedir/UU/UU_${file:69:10}
   cdo mulc,1.0 $savedir/tempUU_invert18.grb $savedir/UU/UU_${file:69:10}
   #
   $extractdir/gribextract -c 166 -l 1,0,0 $file $savedir/tempVV_
   cdo invertlat $fildir/tempVV_00.grb $savedir/tempVV_invert00.grb
   cdo invertlat $fildir/tempVV_06.grb $savedir/tempVV_invert06.grb
   cdo invertlat $fildir/tempVV_12.grb $savedir/tempVV_invert12.grb
   cdo invertlat $fildir/tempVV_18.grb $savedir/tempVV_invert18.grb
   cdo mulc,1.0 $savedir/tempVV_invert00.grb $savedir/VV/VV_${file:69:10}
   cdo mulc,1.0 $savedir/tempVV_invert06.grb $savedir/VV/VV_${file:69:10}
   cdo mulc,1.0 $savedir/tempVV_invert12.grb $savedir/VV/VV_${file:69:10}
   cdo mulc,1.0 $savedir/tempVV_invert18.grb $savedir/VV/VV_${file:69:10}
   #
   $extractdir/gribextract -c 167 -l 1,0,0 $file $savedir/tempT2_
   cdo invertlat $fildir/tempT2_00.grb $savedir/tempT2_invert00.grb
   cdo invertlat $fildir/tempT2_06.grb $savedir/tempT2_invert06.grb
   cdo invertlat $fildir/tempT2_12.grb $savedir/tempT2_invert12.grb
   cdo invertlat $fildir/tempT2_18.grb $savedir/tempT2_invert18.grb
   cdo mulc,1.0 $savedir/tempT2_invert00.grb $savedir/T2D/T2_${file:69:10}
   cdo mulc,1.0 $savedir/tempT2_invert06.grb $savedir/T2D/T2_${file:69:10}
   cdo mulc,1.0 $savedir/tempT2_invert12.grb $savedir/T2D/T2_${file:69:10}
   cdo mulc,1.0 $savedir/tempT2_invert18.grb $savedir/T2D/T2_${file:69:10}
   #
   $extractdir/gribextract -c 134 -l 1,0,0 $file $savedir/tempPa_
   cdo invertlat $fildir/tempPa_00.grb $savedir/tempPa_invert00.grb
   cdo invertlat $fildir/tempPa_06.grb $savedir/tempPa_invert06.grb
   cdo invertlat $fildir/tempPa_12.grb $savedir/tempPa_invert12.grb
   cdo invertlat $fildir/tempPa_18.grb $savedir/tempPa_invert18.grb
   cdo mulc,1.0 $savedir/tempPa_invert00.grb $savedir/Pa/Pa_${file:69:10}
   cdo mulc,1.0 $savedir/tempPa_invert06.grb $savedir/Pa/Pa_${file:69:10}
   cdo mulc,1.0 $savedir/tempPa_invert12.grb $savedir/Pa/Pa_${file:69:10}
   cdo mulc,1.0 $savedir/tempPa_invert18.grb $savedir/Pa/Pa_${file:69:10}
   #
   rm $savedir/temp*
 done

<br/>A similar way of doing it is given in split_time.sh, but I am not using it because I need to calculate QQ in between.

 for file in 1979*; do
   echo $file
   $cdo selcode,167 $file T2_$file
   $cdo selcode,165 $file UU_$file
   $cdo selcode,166 $file VV_$file
   $cdo selcode,39 $file SM_000_007_$file
   $cdo selcode,40 $file SM_007_028_$file
   $cdo selcode,41 $file SM_028_100_$file
   $cdo selcode,42 $file SM_100_250_$file
   $cdo selcode,139 $file ST_000_007_$file
   $cdo selcode,170 $file ST_007_028_$file
   $cdo selcode,183 $file ST_028_100_$file
   $cdo selcode,236 $file ST_100_250_$file
   $cdo selcode,31 $file SIce_$file
   $cdo selcode,172 $file LandS_$file
   $cdo selcode,129 $file tempZ_$file
   # Geopotential is in m^2/s^2; divide by g, 9.80665, see the EC FAQ
   $cdo divc,9.80665 tempZ_$file Z_$file
   $cdo selcode,235 $file SkinT_$file
   $cdo selcode,228 $file tempTP_$file
   $cdo divc,10800.0 tempTP_$file TP_$file
   $cdo selcode,169 $file tempSW_$file
   $cdo divc,10800.0 tempSW_$file SW_$file # average over the previous 3 hours: 1/(3*60*60)
   $cdo selcode,175 $file tempLW_$file
   $cdo divc,10800.0 tempLW_$file LW_$file
   $cdo selcode,141 $file tempSD_$file
   # Snow water equivalent (in m, should be kg/m^2):
   # SWE (kg/m2) = snow depth (m) x snow density (kg/m3); SWE (m) = snow depth (m) x snow density (kg/m3) / water density (kg/m3)
   $cdo mulc,1000.0 tempSD_$file SD_$file
   $cdo selcode,168 $file Dew_$file
 done

=== Invert latitude ===

The Era-Interim files and the GFS data use different coordinates. In the Era-Interim files, the origin (i=0, j=0) is in the upper left corner, whereas in GFS it is in the lower left corner. This is fixed (in TdPa2qq.sh and extract_var_after_split.sh) by writing

 $cdo invertlat $file ${file:(-10)} # ${file:(-10)} keeps the last ten characters, i.e. removes "ma" from the filename

Please note that if the GRIB files contain several grids (atmospheric as well as ground), this inversion must be done after having split the files into variables. We are interested in the ground grid (but as long as no atmospheric fields are contained in the raw data, invertlat can be done in split_time.sh).
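The `${file:(-10)}` expansion keeps the last ten characters of the name, so for files named ma<yyyymmddhh> it drops the leading "ma". For example:

```shell
#!/bin/bash
# A negative offset counts from the end of the string:
file=ma1993013100
echo "${file:(-10)}"   # -> 1993013100
```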
== 9) Create WPS files ==
Log on to Abel and run WPS/WRF/Noah there. Make sure that the preprocessed data is transferred to Abel.
From [http://www.ral.ucar.edu/research/land/technology/lsm/HRLDAS_USERS_GUIDE_34.pdf the HRLDAS user's guide]: "The strategy is to set up model grids and model input files as for a WRF simulation, but then use these files to run HRLDAS instead of WRF."
 ./Noah_hrldas_beta

It requires namelist.hrldas, input files on the format <YYYYMMDDHH>.LDASIN_DOMAIN<>, various .TBL files and wrfinput_d01.<br/>
[[Category:Models]]<br/>[[Category:WRF]]<br/>[[Category:NOAH]]
A list of steps needed to run HRLDAS is given below. The default forcing data to run NOAH/HRLDAS is GFS surface data, produced by NCEP. If you'd instead like to use ERA-Interim, a list of preprocessing steps must be undertaken. These preprocessing steps are marked with an asterisk.
After having received the data, please have a look at the file and make sure that all variables are present. Also, check that all pressure levels are present (there should be 38 pressure levels, starting with 1, 2, 3, 5, 7, 10, 20 hPa... and ending with 950, 975, 1000 hPa).
If you order .mars files (GRIB files) from Bjørg, she'll give you a file path to where they are saved, for instance
Consider creating a new folder within WPS/ for your runs. Copy the namelist.wps into this folder and make sure that the GEOGRID.TBL and METGRID.TBL files are available by adding these lines to the namelist.wps:
In the folder /WRF/WPS, prepare namelist.wps according to your domain. The Java program WRFPortal may help you find coordinates.
This generates a geo_em_d01.nc file. You may check it by typing
To change the geo_em file, change namelist.wps and run ./geogrid.exe again.
Remember to create links to the gribfiles and the Vtables.
After having created the intermediate files, make sure that all variables are present.
If not, the Vtable must be changed accordingly.
Remember to create links to the METGRID.TBL file.
After having created the NetCDF files met_em*, make sure that all variables are present.
If not, the Vtable must be changed accordingly.
If you already have a wrfinput_d01... file, you may check it by typing
After having created the NetCDF files wrfinput, make sure that all variables are present.
Next, run consolidate_grib.exe to consolidate the input files.
Consider creating a new folder within hrldas/hrldas-v3.4.1/HRLDAS_COLLECT_DATA. Copy the namelist.input into this folder. Also, change the GRIB1_PARAMETER_TABLE to be suitable for Era-Interim data. Use the version 128 from this page: rda.ucar.edu/docs/formats/grib/gribdoc/ecmwf_params.html
After having created the NetCDF files <yyyymmddhh>LDASIN, make sure that all variables are present.
Consider creating a new folder within hrldas/hrldas-v3.4.1/Run. Copy the namelist.hrldas and all tables *.TBL into this folder.