Category:MITgcm

Websites:

http://mitgcm.org/

http://mitgcm.org/public/r2_manual/latest/

Getting MITgcm source code:

export CVSROOT=':pserver:cvsanon@mitgcm.org:/u/gcmpack'


# To get MITgcm through CVS, first register with the MITgcm CVS server using the following command (CVS password: cvsanon):

cvs login

# You only need to do a cvs login once. To obtain the latest sources, type:

cvs co -P MITgcm

Installation on abel:

First log in to abel.uio.no. From a Linux machine:

ssh -Y abel.uio.no

If it fails even though you know you have access to abel, please contact us (drift@geo.uio.no).

Then make sure you have downloaded the MITgcm source code on abel; let's assume your code is in $HOME/MITgcm:

cd $HOME/MITgcm

export ROOTDIR=$HOME/MITgcm

export PATH=$ROOTDIR/tools:$PATH

module load netcdf.intel

module switch openmpi.intel/1.6.1 openmpi.intel/1.8


Once you have set up your environment, you are ready to set up a use case and run MITgcm. All MITgcm examples and use cases are in the verification directory. Feel free to test another example, closer to what you wish to run in "real life". As a first example, we will compile and run exp2:

cd $HOME/MITgcm/verification/exp2

You first need to compile MITgcm for this experiment. Compilation (and creation of the MITgcm executable) must be done in the build directory:

cd build

genmake2 -mpi -mods=../code/ -of=optfile.sh


where optfile.sh contains settings specific to abel. You need to create this file yourself, and it should contain:

#!/bin/bash
export LANG=en_US.UTF-8
export LC_ALL=en_US
module load netcdf.intel
module load jasper
module load ncl/6.1.0
export FC=mpif90
export F90C=mpif90
export CC=mpicc
export DEFINES='-DWORDLENGTH=1 -D_BYTESWAPIO'
#
export NETCDF=$NETCDF_ROOT
export HDF5=$HDF5_DIR
export NCARG_ROOT=/cluster/software/VERSIONS/ncl-6.1.0
export JASPERLIB=/cluster/software/VERSIONS/jasper-1.900.1/lib
export JASPERINC=/cluster/software/VERSIONS/jasper-1.900.1/include/jasper
export LIBS="-L${NETCDF}/lib -lnetcdff -lnetcdf"


optfile.sh should be placed in your build directory (or if you place it somewhere else, the full path should be provided when calling genmake2).


Then to compile:

make depend

make >& compile.log


If everything went well, your build directory should contain the MITgcm executable (called mitgcmuv).


If you need to install MITgcm on another machine, please contact drift@geo.uio.no

Running small test cases on abel:

Once compiled, the mitgcmuv executable needs to be moved to the run directory:


mv mitgcmuv ../run

Before running MITgcm, you need to prepare your input files. All these input files are located in the input directory (exp2/input), but when running, they must be available in the run directory. Instead of copying all the input files into run, we create symbolic links:

cd ../run
ln -s ../input/* .


You are now ready to run mitgcmuv. Small configurations (most tutorial examples) can be run interactively:

./mitgcmuv


A successful run ends with:

NORMAL END


Check STDOUT.0000 and search for "Execution ended Normally".

tail STDOUT.0000

(PID.TID 0000.0001) //            Min. Y spins =     1000000000
(PID.TID 0000.0001) //          Total. Y spins =              0
(PID.TID 0000.0001) //            Avg. Y spins =       0.00E+00
(PID.TID 0000.0001) // o Thread number: 000001
(PID.TID 0000.0001) //            No. barriers =          10092
(PID.TID 0000.0001) //      Max. barrier spins =              1
(PID.TID 0000.0001) //      Min. barrier spins =              1
(PID.TID 0000.0001) //     Total barrier spins =          10092
(PID.TID 0000.0001) //      Avg. barrier spins =       1.00E+00
PROGRAM MAIN: Execution ended Normally
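
You can also search for this line directly with grep (run it in your run directory, where STDOUT.0000 was written):

grep "Execution ended Normally" STDOUT.0000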


Several output files are created for this experiment. By default, and at least for exp2, outputs are in binary format. You'll get several files for each variable. Let's take T (Temperature):

ls T.*

T.0000000000.001.001.data  T.0000000000.002.002.data  T.0000000024.002.001.data  T.0000000026.001.002.data
T.0000000000.001.001.meta  T.0000000000.002.002.meta  T.0000000024.002.001.meta  T.0000000026.001.002.meta
T.0000000000.001.002.data  T.0000000024.001.001.data  T.0000000024.002.002.data  T.0000000026.002.001.data
T.0000000000.001.002.meta  T.0000000024.001.001.meta  T.0000000024.002.002.meta  T.0000000026.002.001.meta
T.0000000000.002.001.data  T.0000000024.001.002.data  T.0000000026.001.001.data  T.0000000026.002.002.data
T.0000000000.002.001.meta  T.0000000024.001.002.meta  T.0000000026.001.001.meta  T.0000000026.002.002.meta


The meta files (*.meta) are metadata files, i.e. text files containing information about the data files (*.data); the *.data files are binary files containing the Temperature values.

T.[timestep].[X].[Y].meta

T.[timestep].[X].[Y].data


where timestep is the model timestep at which the variable was stored.

X, Y correspond to the subgrids as defined in code/SIZE.h.

In our example (check code/SIZE.h), we have 2 x 2 subgrids, which is why we get 4 files per variable for a given timestep.
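
Since the *.meta files are plain text, you can inspect one directly to see the dimensions, precision and timestep of the matching *.data file, for instance:

cat T.0000000024.001.001.meta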


Getting MITgcm netCDF outputs:

It is recommended to generate netCDF outputs instead of MITgcm binary outputs: they are easier to analyze and visualize, and the format is more portable.


For this, you need to enable netCDF when compiling MITgcm. Several examples are set up to create netCDF outputs instead of standard MITgcm binary outputs (global_ocean.90x40x15, etc.), but let's assume we want to generate netCDF outputs for our example exp2.

In our exp2 example, we first clean any previous compilation in the build directory:

cd exp2

rm -rf build

mkdir build

cd build


And then we recompile with the -enable=mnc option:

genmake2 -mpi -enable=mnc -mods=../code/ -of=optfile.sh

make depend

make >& compile.log


(don't forget to create optfile.sh in the new build directory, as described above).


When running, netCDF output (the MNC package) needs to be activated:

cd exp2/input

cp ../../global_ocean.90x40x15/input/data.mnc .

cp ../../global_ocean.90x40x15/input/data.diagnostics .


Please note that all these files can be customized (see the MITgcm documentation); here we simply copy existing files.


Then edit data.pkg and add "useMNC=.TRUE.,", i.e.:

# Packages
 &PACKAGES
  useMNC=.TRUE.,
 &

This option indicates that you wish to generate netCDF outputs at runtime (by default, binary outputs are generated even if you have compiled your code with netCDF).

When running mitgcmuv, netCDF outputs are generated in a subdirectory; in our example it is called mnc_test_0001 (this is what is specified in data.mnc; you can change it to a more meaningful name!).

As before, you'll get a netCDF file per subgrid (tile), but this time one netCDF file contains several variables and several timesteps. To merge the per-tile files and make them easier to analyse, use python and gluemncbig:

module load python/anaconda

gluemncbig -o grid.nc mnc_test_0001/grid.*.nc
gluemncbig -o state.nc mnc_test_0001/state.*.nc
gluemncbig -o phiHyd.nc mnc_test_0001/phiHyd.*.nc

Then the resulting files can be viewed for instance with ncview:

module load ncview

ncview state.nc
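
If you prefer a quick text summary, the standard netCDF utility ncdump (available once a netCDF module such as netcdf.intel is loaded) prints the header with dimensions and variables:

ncdump -h state.nc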


Full documentation for MNC is available at [1]


Choose your resolution

The resolution you want to use is set in the file called SIZE.h located in the code folder.


sNx :: No. X points in sub-grid.
sNy :: No. Y points in sub-grid.
OLx :: Overlap extent in X. (usually set to 3)
OLy :: Overlap extent in Y. (usually set to 3)
nSx :: No. sub-grids in X.
nSy :: No. sub-grids in Y.
nPx :: No. of processes to use in X. (depends on how many CPUs you use on abel; the number of tasks is nPx * nPy)
nPy :: No. of processes to use in Y.
Nx  :: No. points in X for the total domain. (how many points you have in the east-west direction in total, e.g. with a 1-degree resolution, Nx=360)
Ny  :: No. points in Y for the total domain. (how many points you have in the north-south direction in total, e.g. with a 1-degree resolution, Ny=180)
Nr  :: No. points in Z for the full process domain. (the number of vertical layers)
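
As a worked (hypothetical) example: for a 1-degree global grid (Nx=360, Ny=180) run on 8 MPI tasks, one consistent choice would be sNx=90, sNy=90, nSx=1, nSy=1, nPx=4, nPy=2, since Nx = sNx * nSx * nPx = 90 * 1 * 4 = 360, Ny = sNy * nSy * nPy = 90 * 1 * 2 = 180, and the number of tasks is nPx * nPy = 8.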


NB:

  • You need to recompile the model every time you change anything in SIZE.h.
  • You do not need to recompile if you change anything in the input directory.
  • If you wish to run MITgcm at high resolution, it is crucial that the horizontal grid has nearly square grid cells, because there is only one viscosity parameter for both the i and j directions. See http://mitgcm.org/pipermail/mitgcm-support/2010-January/006442.html for more information.


Running large cases on abel:

When running large cases, and more generally for your research, it is best to use netCDF outputs. In addition, you cannot run "interactively": when logged in on abel, a limit of 30 minutes of CPU time is applied. It is also likely that you will run MITgcm with MPI, using several processors.


Create a job command file

Create a script (or a job command file) where all the resources you need to run MITgcm are specified. Let's call it MITgcm.job

#!/bin/bash
# Job name:
#SBATCH --job-name=run_MITgcm
#
# Project (change it to your NOTUR or uio project):
#SBATCH --account=XXXXX
#
# Wall clock limit (to be adjusted!):
#SBATCH --time=24:0:0
#
# Max memory usage per core:
#SBATCH --mem-per-cpu=4G
#
# Adjust the number of processors (MPI tasks); the directive below is
# disabled ("# SBATCH"): remove the space after "#" to enable it.
# SBATCH --ntasks=64
#SBATCH --exclusive
#
#Set up job environment: DO NOT CHANGE
export LANG=en_US.UTF-8 
export LC_ALL=en_US 
source /cluster/bin/jobsetup
ulimit -l unlimited
module load netcdf.intel
module switch openmpi.intel/1.6.1 openmpi.intel/1.8

# NOTE: cd to where you have the MITgcm executable
cd $HOME/MITgcm/verification/exp2/run

mpirun ./mitgcmuv > mitgcmuv.out

Please note that you need to adjust account (use your NOTUR account if you have one, or your uio project), time, and ntasks (the number of processors required for your MITgcm configuration).

Adjust ntasks

The number of tasks you need depends on your MITgcm configuration: it can be found in the code/SIZE.h file and is nPx * nPy.
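
For instance (hypothetical values): with nPx=4 and nPy=2 in code/SIZE.h, you would request 8 tasks in your job command file:

#SBATCH --ntasks=8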

Submit/monitor your job command file

Submit your job

sbatch MITgcm.job

Monitor your job

squeue -u $USER
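
Should you need to cancel a job, the standard SLURM command scancel takes the job id shown by squeue (123456 below is a placeholder):

scancel 123456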


For more information on the batch system on abel, follow this link: http://www.uio.no/english/services/it/research/hpc/abel/help/user-guide/job-scripts.html

Troubleshooting:

No python/anaconda modulefile available

If python/anaconda is not available on abel, it is likely because you need to set up your local environment properly:

Create a directory called privatemodules in your home directory:

mkdir $HOME/privatemodules


Then create (or edit) the $HOME/.modulerc file (please note the "dot" in front of the filename). This file must contain the special modulefile token "#%Module1.0" on the first line, followed by one or more module commands:

#%Module1.0

set version 1.0
module load use.own


and any personal module files in this privatemodules directory will become available for you to list, load or unload.


For python/anaconda, create a subdirectory called python:

mkdir $HOME/privatemodules/python


Then place a modulefile called anaconda in this new python directory. If you have any problems, please contact drift@geo.uio.no for help.
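
A minimal sketch of such a modulefile, assuming anaconda is installed under /cluster/software/VERSIONS/anaconda (the path is an assumption; adjust it to the actual installation on abel):

#%Module1.0
## Hypothetical modulefile for python/anaconda; "root" below is an
## assumed installation path, adjust it before use.
set root /cluster/software/VERSIONS/anaconda
prepend-path PATH $root/bin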
