MITgcm

Websites:

http://mitgcm.org/

http://mitgcm.org/public/r2_manual/latest/

Getting MITgcm source code:

export CVSROOT=':pserver:cvsanon@mitgcm.org:/u/gcmpack'


# To get MITgcm through CVS, first log in to the MITgcm CVS server with the command below (the CVS password is cvsanon):

cvs login

# You only need to do a cvs login once. To obtain the latest sources, type:

cvs co -P MITgcm

Installation on abel:

Make sure you have downloaded the MITgcm source code; in the following we assume it is in $HOME/MITgcm:

cd $HOME/MITgcm

export ROOTDIR=$HOME/MITgcm

export PATH=$ROOTDIR/tools:$PATH

module load netcdf.intel
module switch openmpi.intel/1.6.1 openmpi.intel/1.8


Once you have set up your environment, you are ready to set up a use case and run MITgcm.  All MITgcm examples and use cases are in the verification directory. Feel free to test another example, ideally one close to what you wish to run in "real life". As a first example, we will compile and run exp2:

cd $HOME/MITgcm/verification/exp2

genmake2 -mpi -mods=../code/ -of=optfile.sh


where optfile.sh contains specific settings for abel. You need to create this file; it should contain:

#!/bin/bash
export LANG=en_US.UTF-8
export LC_ALL=en_US
module load netcdf.intel
module switch openmpi.intel/1.6.1 openmpi.intel/1.8
module load jasper
module load ncl/6.1.0
export FC=mpif90
export F90C=mpif90
export CC=mpicc
export DEFINES='-DWORDLENGTH=1 -D_BYTESWAPIO'
#
export NETCDF=/cluster/software/VERSIONS/netcdf.intel-4.2.1.1
export HDF5=/cluster/software/VERSIONS/hdf5-1.8.9_intel
export NCARG_ROOT=/cluster/software/VERSIONS/ncl-6.1.0
export JASPERLIB=/cluster/software/VERSIONS/jasper-1.900.1/lib
export JASPERINC=/cluster/software/VERSIONS/jasper-1.900.1/include/jasper
export LIBS="-L${NETCDF}/lib -lnetcdff -lnetcdf"


Then to compile:

make depend

make >& compile.log


If everything went well, your build directory should contain the MITgcm executable (called mitgcmuv).


Running small test cases on abel:

Once compiled, the mitgcmuv executable needs to be moved to the run directory:


mv mitgcmuv ../run

Before running MITgcm, you need to prepare your input files. All the input files are located in the input directory (exp2/input), but when running they must be available in the run directory. Instead of copying all the input files into run, we create symbolic links:

cd ../run
ln -s ../input/* .


You are now ready to run mitgcmuv. Small configurations (most tutorial examples) can be run interactively:

./mitgcmuv


A successful run ends with:

NORMAL END


Check STDOUT.0000 and search for "Execution ended Normally".

tail STDOUT.0000

(PID.TID 0000.0001) //            Min. Y spins =     1000000000
(PID.TID 0000.0001) //          Total. Y spins =              0
(PID.TID 0000.0001) //            Avg. Y spins =       0.00E+00
(PID.TID 0000.0001) // o Thread number: 000001
(PID.TID 0000.0001) //            No. barriers =          10092
(PID.TID 0000.0001) //      Max. barrier spins =              1
(PID.TID 0000.0001) //      Min. barrier spins =              1
(PID.TID 0000.0001) //     Total barrier spins =          10092
(PID.TID 0000.0001) //      Avg. barrier spins =       1.00E+00
PROGRAM MAIN: Execution ended Normally


For this experiment (exp2), several output files are created. By default, outputs are in binary format, and you get several files for each variable. Let's take T (temperature):

ls T.*

T.0000000000.001.001.data  T.0000000000.002.002.data  T.0000000024.002.001.data  T.0000000026.001.002.data
T.0000000000.001.001.meta  T.0000000000.002.002.meta  T.0000000024.002.001.meta  T.0000000026.001.002.meta
T.0000000000.001.002.data  T.0000000024.001.001.data  T.0000000024.002.002.data  T.0000000026.002.001.data
T.0000000000.001.002.meta  T.0000000024.001.001.meta  T.0000000024.002.002.meta  T.0000000026.002.001.meta
T.0000000000.002.001.data  T.0000000024.001.002.data  T.0000000026.001.001.data  T.0000000026.002.002.data
T.0000000000.002.001.meta  T.0000000024.001.002.meta  T.0000000026.001.001.meta  T.0000000026.002.002.meta


The meta files (*.meta) are metadata files, i.e. text files containing information about the data files (*.data). The *.data files are binary files containing the temperature values. The naming convention is:

T.[timestep].[X].[Y].meta

T.[timestep].[X].[Y].data


where timestep is the model timestep at which the variable was written.

X and Y correspond to the subgrids (tiles) as defined in code/SIZE.h.

In our example (check code/SIZE.h), the domain is split into 2 x 2 subgrids, which is why we have 4 files per variable for a given timestep.
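
A tile can be inspected directly in Python with numpy; this is a minimal sketch assuming 32-bit output (MITgcm's default writeBinaryPrec=32) and placeholder tile dimensions, which you should read from the matching .meta file:

import numpy as np

# MITgcm MDS binary output is big-endian; '>f4' = big-endian 32-bit real.
# If your run writes 64-bit output (writeBinaryPrec=64), use '>f8' instead.
# nx, ny, nz below are placeholders: take the real values from the
# dimList entries of the corresponding .meta file.
nx, ny, nz = 45, 20, 4
tile = np.fromfile('T.0000000000.001.001.data', dtype='>f4')
tile = tile.reshape((nz, ny, nx))   # x varies fastest in the file
print(tile.shape, tile.min(), tile.max())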



Getting MITgcm netCDF outputs:

It is recommended to generate netCDF (http://www.unidata.ucar.edu/software/netcdf/) outputs instead of MITgcm binary outputs: netCDF files are easier to analyze and visualize, and they are more portable.


For this, you need to enable netCDF when compiling MITgcm.  Several examples are meant to create netCDF outputs instead of standard MITgcm binary outputs (global_ocean.90x40x15, etc.) but let's assume we want to generate netCDF outputs for our example exp2.

In our exp2 example, we first clean any previous compilation in the build directory:

cd exp2

rm -rf build

mkdir build

cd build


Then we recompile with the -enable=mnc option:

genmake2 -mpi -enable=mnc -mods=../code/ -of=optfile.sh

make depend

make >& compile.log


(Don't forget to create optfile.sh, as described above.)


When running, netCDF (MNC) output needs to be activated:

cd exp2/input

cp ../../global_ocean.90x40x15/input/data.mnc .

cp ../../global_ocean.90x40x15/input/data.diagnostics .


Please note that all these files can be customized (see the MITgcm documentation); here we simply copy existing files.


Then edit data.pkg and add "useMNC=.TRUE.,", i.e.:

# Packages
 &PACKAGES
  useMNC=.TRUE.,
 &

This option indicates that you wish to generate netCDF outputs at runtime (by default, binary outputs are generated even if you compiled your code with netCDF support).

When running mitgcmuv, netCDF outputs are generated in a subdirectory; in our example it is called mnc_test_0001 (this name is specified in data.mnc; you can change it to something more meaningful).

As before, you get one netCDF file per subgrid (tile), but this time each netCDF file contains several variables and several timesteps. To merge the tiles into single files, use python and gluemncbig:

module load python/anaconda

gluemncbig -o grid.nc mnc_test_0001/grid.*.nc
gluemncbig -o state.nc mnc_test_0001/state.*.nc
gluemncbig -o phiHyd.nc mnc_test_0001/phiHyd.*.nc
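
The merged files can then be inspected in Python; this is a minimal sketch assuming the netCDF4 package is available in the loaded anaconda module (the variable name 'Temp' is an assumption; check the actual names with ncdump -h state.nc):

from netCDF4 import Dataset

# Open the merged state file and list its contents.
nc = Dataset('state.nc')
print(list(nc.variables))        # e.g. Temp, S, U, V, Eta, ...
temp = nc.variables['Temp'][:]   # 'Temp' assumed; verify with ncdump -h
print(temp.shape)                # typically (time, depth, y, x)
nc.close()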

Then the resulting files can be viewed for instance with ncview:

module load ncview

ncview state.nc


(If python/anaconda is not available, your local environment is probably not set up properly; please contact drift@geo.uio.no for help.)


Full documentation for MNC is available at http://mitgcm.org/public/r2_manual/latest/online_documents/node273.html



If you need to install MITgcm on another machine, please contact drift@geo.uio.no
