Revision as of 11:37, 8 December 2014
OpenIFS - Open Integrated Forecasting System
OpenIFS provides academic and research institutions with an easy-to-use version of the ECMWF IFS (Integrated Forecasting System). OpenIFS provides the forecast capability of IFS (no data assimilation), supporting software and documentation. OpenIFS has a support team at ECMWF for technical assistance but limited resources for detailed scientific assistance.
About OpenIFS provides more information about the model.
First, check that openIFS is available on your platform. Currently, openIFS is available only on the UiO HPC cluster (abel.uio.no).
module avail openifs
----------------------- /cluster/etc/modulefiles ----------------------------
openifs/38r1v04(default)
If nothing is returned, openIFS is most likely not available. If you think it should be, contact us (drift@uio.no).
To load a given openifs version:
module load openifs/38r1v04
We suggest you always specify the version you wish to use, to avoid problems if we install a new default version (it is important to stick to exactly the same version throughout your simulations).
When loading openifs the following environment variables are defined:
- OIFS_CYCLE: ECMWF cycle such as 38r1 (version 38, revision 1)
- OIFS_HOME: directory where openIFS has been installed; this can be used to check openIFS sources
- OIFS_DATA: contains openIFS static input data. Even if you use your own compiled version of openIFS, it is NOT necessary to download these files again.
- OIFS_EXAMPLES: directory where all openIFS examples are stored. If you wish to run one of these examples see our dedicated section on running tutorials.
- OIFS_COMP: compilers used to create openIFS executables. We currently use Intel compilers.
- OIFS_BUILD: type of build; it triggers a set of compiler options when building openIFS. Currently set to "opt".
- OIFS_GRIB_API_DIR: GRIB-API version used by openIFS to encode/decode GRIB files
- MARS_LSM_PATH: land sea mask used
- EMOS_VERSION: EMOS version (compiled with GRIB-API support). EMOS is used for interpolation routines.
- EMOS_HOME: directory where EMOS has been installed.
- BUFR_TABLES: bufrtables used for decoding/encoding BUFR (not used in the forecast model)
- CREX_TABLES: crex tables used for encoding/decoding CREX data.
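After loading the module, you can quickly verify a few of these variables in your shell. A minimal sketch; the := defaults below are placeholders so the snippet runs anywhere, whereas on abel the modulefile sets the real values:

```shell
# Sketch: check a few of the openIFS environment variables.
# The := defaults are illustrative placeholders only; after
# `module load openifs/38r1v04` the modulefile provides the real values.
: "${OIFS_CYCLE:=38r1}"
: "${OIFS_BUILD:=opt}"
: "${OIFS_GRIB_API_DIR:=/path/to/grib_api}"   # placeholder path
echo "cycle=$OIFS_CYCLE build=$OIFS_BUILD grib_api=$OIFS_GRIB_API_DIR"
```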
For building openIFS, we use intel compilers:
module list
Currently Loaded Modulefiles:
  1) use.own            3) openmpi.intel/1.6.1   5) hdf5/1.8.9_intel      7) grib_api/1.12.3     9) udunits/2.2.16    11) intelmpi.intel/4.1.3
  2) intel/2011.10      4) python2/2.7.3         6) netcdf.intel/4.2.1.1  8) perlmodules/5.10_2 10) intel/2013.sp1.1  12) openifs/38r1v04
Running openIFS:
The main openIFS program (the forecast model) is called master.exe, but as with most models, several steps must be completed before the first call to master.exe.
The main steps are:
- get meteorological input fields (see examples on how to get ECMWF reanalysis data with ecmwfapi).
- interpolate input fields to the chosen horizontal and vertical resolution
- run forecast model to generate output fields
To perform these steps, some python scripts have been developed:
Step-1: get meteorological input fields
- getEIdata.py
This script retrieves the ERA-Interim data required to run openIFS from the ECMWF meteorological archive (MARS). Make sure ecmwfapi is properly set up for your user account; see here how to get ECMWF API keys. If you have any problems, contact drift@geo.uio.no
To use getEIdata.py, you need to specify the retrieval dates (at least the starting date). For instance:
getEIdata.py --start_date=20030103
The above command retrieves ERA-Interim data for 3 January 2003 at time=00 and writes it to $WORKDIR/20030103/00 in three files:
- EI_shml.grb: contains spherical harmonics fields on model levels
- EI_ggml.grb: contains model levels fields on a reduced gaussian grid
- EI_iniggsfc.grb: contains surface data on a reduced gaussian grid
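For longer retrievals the same script can be driven over a date range. A sketch using only flags listed by getEIdata.py -h; the command is built and echoed rather than executed here, since MARS access requires a configured ecmwfapi account:

```shell
# Sketch: build a getEIdata.py call for a date range, both analysis times.
# Flags are those documented by `getEIdata.py -h`; echoed, not executed.
start=20030103
end=20030110
cmd="getEIdata.py --start_date=$start --end_date=$end --times=00/12"
echo "$cmd"
```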
The table below gives a short description of the parameters contained in these 3 files:
Shortname | Description | Retrieved/Computed/Climate | File
lnsp | Logarithm of surface pressure | Retrieved from MARS | EI_shml.grb
t | Temperature | Retrieved from MARS | EI_shml.grb
vo | Vorticity (relative) | Retrieved from MARS | EI_shml.grb
d | Divergence | Retrieved from MARS | EI_shml.grb
o3 | Ozone mass mixing ratio | Retrieved from MARS | EI_ggml.grb
clwc | Specific cloud liquid water content | Retrieved from MARS | EI_ggml.grb
ciwc | Specific cloud ice water content | Retrieved from MARS | EI_ggml.grb
cc | Cloud cover | Retrieved from MARS | EI_ggml.grb
q | Specific humidity | Retrieved from MARS | EI_ggml.grb
crwc | Specific rain water content | Computed (initialized to 0) | EI_ggml.grb
cswc | Specific snow water content | Computed (initialized to 0) | EI_ggml.grb
swvl1 | Volumetric soil water layer 1 | Retrieved from MARS; soil moisture rescaled from TESSEL to HTESSEL | EI_iniggsfc.grb
swvl2 | Volumetric soil water layer 2 | " | EI_iniggsfc.grb
swvl3 | Volumetric soil water layer 3 | " | EI_iniggsfc.grb
swvl4 | Volumetric soil water layer 4 | " | EI_iniggsfc.grb
stl1 | Soil temperature level 1 | Retrieved from MARS | EI_iniggsfc.grb
stl2 | Soil temperature level 2 | Retrieved from MARS | EI_iniggsfc.grb
stl3 | Soil temperature level 3 | Retrieved from MARS | EI_iniggsfc.grb
stl4 | Soil temperature level 4 | Retrieved from MARS | EI_iniggsfc.grb
skt | Skin temperature | Retrieved from MARS | EI_iniggsfc.grb
tsn | Temperature of snow layer | Retrieved from MARS | EI_iniggsfc.grb
sd | Snow depth | Retrieved from MARS; adjusted with 10 metre wind speed | EI_iniggsfc.grb
rsn | Snow density | Retrieved from MARS; adjusted with 10 metre wind speed | EI_iniggsfc.grb
asn | Snow albedo | Retrieved from MARS | EI_iniggsfc.grb
src | Skin reservoir content | Retrieved from MARS | EI_iniggsfc.grb
ci | Sea-ice cover | Retrieved from MARS | EI_iniggsfc.grb
sst | Sea surface temperature | Retrieved from MARS | EI_iniggsfc.grb
istl1 | Ice temperature layer 1 | Retrieved from MARS | EI_iniggsfc.grb
istl2 | Ice temperature layer 2 | Retrieved from MARS | EI_iniggsfc.grb
istl3 | Ice temperature layer 3 | Retrieved from MARS | EI_iniggsfc.grb
istl4 | Ice temperature layer 4 | Retrieved from MARS | EI_iniggsfc.grb
anor | Angle of sub-gridscale orography | Climate data | EI_iniggsfc.grb
isor | Anisotropy of sub-gridscale orography | Climate data | EI_iniggsfc.grb
slor | Slope of sub-gridscale orography | Climate data | EI_iniggsfc.grb
sdor | Standard deviation of orography | Climate data | EI_iniggsfc.grb
sr | Surface roughness | Climate data | EI_iniggsfc.grb
lsrh | Logarithm of surface roughness length for heat | Climate data | EI_iniggsfc.grb
cvh | High vegetation cover | Climate data | EI_iniggsfc.grb
cvl | Low vegetation cover | Climate data | EI_iniggsfc.grb
tvh | Type of high vegetation | Climate data | EI_iniggsfc.grb
al | Albedo | Climate data, time-interpolated from monthly means | EI_iniggsfc.grb
aluvp | UV visible albedo for direct radiation | " | EI_iniggsfc.grb
aluvd | UV visible albedo for diffuse radiation | " | EI_iniggsfc.grb
alnip | Near IR albedo for direct radiation | " | EI_iniggsfc.grb
alnid | Near IR albedo for diffuse radiation | " | EI_iniggsfc.grb
lai_lv | Leaf area index, low vegetation | " | EI_iniggsfc.grb
lai_hv | Leaf area index, high vegetation | " | EI_iniggsfc.grb
sdfor | Standard deviation of filtered subgrid orography | Static data | EI_iniggsfc.grb
lsm | Land-sea mask | Static data | EI_iniggsfc.grb
z | Geopotential | Static data | EI_iniggsfc.grb
slt | Soil type | Static data | EI_iniggsfc.grb
Check the ECMWF parameter database for a more extensive description of all available parameters.
TIP: to get the usage of any of these python scripts, use -h or --help. For instance:
getEIdata.py -h
Usage: getEIdata.py --start_date=YYYYMMDD [--end_date=YYYYMMDD] [--times=tt1/tt2/tt3]
                    [--grid_type=gtype] [--levels=nlevels] [--sfc=sfc.grb]
                    [--ggml=ggml.grb] [--shml=shml.grb] [--inputdir=input_directory]
                    [--outputdir=output_directory]

Options:
  -h, --help            show this help message and exit
  -s start_date, --start_date=start_date
                        start date YYYYMMDD
  -e end_date, --end_date=end_date
                        end_date YYYYMMDD
  -t times, --times=times
                        times such as 00/12
  -l levels, --levels=levels
                        number of levels i.e. 60
  --sfc=sfc             output filename where surface data will be stored
  --ggml=ggml           output filename where upper data on reduced gaussion grid will be stored
  --shml=shml           output filename where shperical harmonic upper data will be stored
  --inputdir=inputdir   root directory for reading input files (valid when retrieve=no)
  --outputdir=outputdir root directory for storing output files
  --grid_type=grid_type grid type default is l_2
For each of these python scripts, some arguments are optional; they are shown in square brackets (e.g. end_date).
Step-2: interpolate input fields to the chosen horizontal and vertical resolution
To interpolate your input fields to the horizontal and vertical resolution of your choice, 3 steps are necessary:
- prepareInterpolationEIdata.py: create a namelist for running openIFS in "interpolation mode" and the inputs necessary to run it
- runInterpolation.py: run openIFS in "interpolation mode" using mpirun to launch the main executable (called master.exe) in parallel.
- postInterpolationEIdata.py: some fields need to be "adjusted", which is the purpose of this script: snow depth is adjusted, and soil moisture is rescaled from TESSEL to HTESSEL to preserve the land-surface evaporation impact.
Step-3: run openIFS forecast model
- prepareOpenifsRun.py: create a namelist for running openIFS in "forecast mode" and check that all the inputs to run it are available.
- runOpenIFS.py: run openIFS in "forecast mode" using mpirun to launch the main executable (called master.exe) in parallel.
You can get the full usage of each of the above scripts with the name of the script followed by -h (or --help). For instance:
prepareInterpolationEIdata.py -h
Usage: prepareInterpolationEIdata.py --start_date=YYYYMMDD [--end_date=YYYYMMDD]
                    [--times=tt1/tt2/tt3] [--levels=nlevels] [--expver=expver]
                    [--grid_type=gtype] [--resol=resol] [--ntasks=ntasks]
                    [--inputdir=input_directory] [--namelist_in=namelist_template]
                    [--sfc_in=sfc.grb] [--ggml_in=ggml.grb] [--shml_in=shml.grb]
                    [--outputdir=output_directory]

Options:
  -h, --help            show this help message and exit
  -s start_date, --start_date=start_date
                        start date YYYYMMDD
  -e end_date, --end_date=end_date
                        end_date YYYYMMDD
  -t times, --times=times
                        times such as 00/12
  -l levels, --levels=levels
                        number of levels i.e. 60
  --grid_type=grid_type grid type default is l_2
  --resol=resol         output resolution
  --inputdir=inputdir   root directory where input files are stored
  --sfc_in=sfc_in       input filename where surface data is stored
  --ggml_in=ggml_in     input filename where upper data on reduced gaussion grid is stored
  --shml_in=shml_in     input filename where shperical harmonic upper data is stored
  --namelist_in=namelist_in
                        namelist template for running interpolation
  --outputdir=outputdir root directory for storing output files
  --ntasks=ntasks       number of MPI tasks for running interpolation code
  --expver=expver       expver
openIFS examples:
There are 3 examples currently available:
cd $OIFS_EXAMPLES
ls
ei  osloctm  t21test
T21 simple test:
Choose a directory of your choice and copy this simple test:
mkdir -p $HOME/openifs/examples
cd $HOME/openifs/examples
cp -R $OIFS_EXAMPLES/t21test .
cd t21test
ls
fort.4  ICMGGepc8  ICMGGepc8INIT  ICMGGepc8INIUA  ICMSHepc8  ICMSHepc8INIT  ifsdata  job  namelists  README  ref_021_0144
./job_uio
This is a very short and simple example that can be run interactively (./job_uio).
All the input fields are already available; by executing job_uio, you run the ECMWF openIFS forecast model.
Output fields generated by openIFS are:
ICMSHepc8+000000 ICMGGepc8+000000
SH means Spherical harmonics and GG Gaussian grid. These files are in GRIB-2 format.
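The output names follow a fixed pattern: ICM, then the grid type (SH or GG), then the experiment identifier, then '+' and the forecast step. A small sketch of pulling the parts apart (treating the experiment id as 4 characters is an assumption based on the names above):

```shell
# Sketch: split an openIFS output name into grid type, experiment id and step.
f="ICMSHepc8+000000"
grid=${f:3:2}    # SH (spherical harmonics) or GG (Gaussian grid)
expid=${f:5:4}   # experiment identifier, here epc8 (assumed 4 characters)
step=${f#*+}     # forecast step
echo "$grid $expid $step"
```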
You may convert these GRIB files to netCDF using cdo; for instance on sverdrup:
module load cdo/1.6.5.1
cdo -t ecmwf -f nc copy ICMSHepc8+000000 ICMSHepc8+000000.nc
For more information, see ECMWF How to convert OpenIFS outputs to netCDF.
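If you want every spherical-harmonic output converted, the same cdo call can be wrapped in a loop. A sketch that only echoes the commands, since cdo and the GRIB files live on the cluster:

```shell
# Sketch: convert each ICMSH* GRIB output to netCDF.
# Commands are echoed, not executed; on the cluster use: for f in ICMSH*+*
for f in ICMSHepc8+000000; do
  cmd="cdo -t ecmwf -f nc copy $f $f.nc"
  echo "$cmd"
done
```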
T319 with ERA-Interim input fields:
This example shows how to run openIFS with ERA-Interim input fields. To try it on abel:
mkdir examples
cd examples
module load openifs/38r1v04
cp -R $OIFS_EXAMPLES/ei .
cd ei
ls
getEIdata.job  ifsnam_fco3_template  ifsnam_fpos_template  initrun  README  runInterpolation.job  runOpenIFS.job
This directory contains everything you need to run openIFS on abel:
- getEIdata.job: get ERA-Interim GRIB fields for running openIFS
- runInterpolation.job: interpolate downloaded ERA-Interim GRIB fields to the chosen resolution (T319)
- runOpenIFS.job: run openIFS forecast model (forecast length is set to 48 hours).
You may have to change the account used for running batch jobs on abel (currently we use the geofag queue). To do so, edit getEIdata.job, runInterpolation.job and runOpenIFS.job and update the account:
#SBATCH --account=geofag
Replace geofag with your own Notur account. In case of doubt or problems, contact drift@geo.uio.no
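The edit can also be done in one go with sed. A sketch demonstrated on a temporary file (myproject is a placeholder for your actual Notur account); on abel you would run the sed line directly on the three job scripts:

```shell
# Sketch: switch the Slurm account with sed, shown on a demo file.
# On abel: sed -i 's/--account=geofag/--account=myproject/' \
#              getEIdata.job runInterpolation.job runOpenIFS.job
printf '#SBATCH --account=geofag\n' > demo.job
sed -i 's/--account=geofag/--account=myproject/' demo.job
cat demo.job
```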
Once you have updated your account, you can run this example by submitting getEIdata.job:
sbatch getEIdata.job
All the other jobs will be submitted automatically.
Once the 3 jobs are done, you get 3 files called slurm-*.out (where * is the integer job id).
The local directory will not contain your model outputs; they are located in $WORKDIR/319, with subdirectories for dates (20120801, 20120802) and times (00, 12).
For instance:
cd $WORKDIR/319/20120801/00
ls -lrt | cut -c 27-
69 Dec 3 14:45 sporog -> /cluster/software/VERSIONS/openifs/ifsdata/38r1/climate/319l_2/sporog
54 Dec 3 14:45 ifsdata -> /cluster/software/VERSIONS/openifs/ifsdata/climatology
69 Dec 3 14:45 const.clim.000 -> /cluster/software/VERSIONS/openifs/ifsdata/38r1/climate/319l_2/lsmoro
63 Dec 3 14:45 319l_2 -> /cluster/software/VERSIONS/openifs/ifsdata/38r1/climate/319l_2/
37676712 Dec 3 14:46 ICMSHb0z5INIT
108466080 Dec 3 14:46 ICMGGb0z5INIUA
11097600 Dec 3 14:46 ICMGGb0z5INIT
4599 Dec 3 14:46 fort.4
76321800 Dec 3 14:47 ICMSHb0z5+000000
46643016 Dec 3 14:47 ICMGGb0z5+000000
76321800 Dec 3 14:53 ICMSHb0z5+000036
46643016 Dec 3 14:53 ICMGGb0z5+000036
76321800 Dec 3 14:55 ICMSHb0z5+000072
46643016 Dec 3 14:55 ICMGGb0z5+000072
76321800 Dec 3 14:56 ICMSHb0z5+000108
46643016 Dec 3 14:56 ICMGGb0z5+000108
76321800 Dec 3 14:58 ICMSHb0z5+000144
1008374 Dec 3 14:58 NODE.001_01
7 Dec 3 14:58 ncf927
15857 Dec 3 14:58 ifs.stat
46643016 Dec 3 14:58 ICMGGb0z5+000144
This directory contains the model inputs (ERA-Interim data plus some climate or adjusted fields), the openIFS namelist, and the model outputs for several steps. The GRIB fields (ICM*) can be visualized with metview, python, matlab, etc., and their content can be browsed with GRIB-API tools and cdo.
T42 with ERA-Interim input fields for OsloCTM model:
This example is very similar to the previous example:
mkdir examples
cd examples
module load openifs/38r1v04
cp -R $OIFS_EXAMPLES/osloctm .
cd osloctm
ls
getEIdata.job  ifsnam_fc_template  ifsnam_fpos_template  initrun  Readme.md  runInterpolation.job  runOpenIFS.job
The main difference is that openIFS now generates the extra fields required by OsloCTM (option -v norway when running master.exe; see runOpenIFS.job).
You may have to change the account used for running batch jobs on abel (currently we use the geofag queue). To do so, edit getEIdata.job, runInterpolation.job and runOpenIFS.job and update the account:
#SBATCH --account=geofag
Replace geofag with your own Notur account. In case of doubt or problems, contact drift@geo.uio.no
Once you have updated your account, you can run this example by submitting getEIdata.job:
sbatch getEIdata.job
All the other jobs will be submitted automatically.
Once the 3 jobs are done, you get 3 files called slurm-*.out (where * is the integer job id).
The local directory will not contain your model outputs. All the model outputs should be located in $WORKDIR/42.
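Since this example is "very similar" to the T319 one, the per-date/per-time layout is presumably the same under $WORKDIR/42. A sketch that rebuilds the directory skeleton locally just to illustrate the assumed shape (the dates and times are the sample values from the previous example):

```shell
# Sketch: assumed output tree shape under $WORKDIR/42,
# built locally for illustration only.
mkdir -p 42/20120801/00 42/20120801/12
find 42 -type d | sort
```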