Submit Matlab jobs on Abel

This is an example of a Matlab submit script for use on the Abel cluster. Abel usage is documented at UiO's HPC pages.


For normal off-line, non-parallel Matlab runs on any Linux server, you can simply do e.g.

<pre>nohup matlab -nodisplay -nojvm -nodesktop -nosplash < Matlabprog.m > outdata.txt &
</pre>
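Matlab then runs in the background and keeps running after you log out, with its output collected in outdata.txt. To keep an eye on such a run you can, for example, follow the output file and list your background jobs (outdata.txt is just the file name used in the example above):

<pre># Follow the Matlab output as it is written (Ctrl-C stops tail, not Matlab)
tail -f outdata.txt

# List background jobs started from the current shell
jobs
</pre>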

For large jobs, the following submit script should set you up on the Abel cluster:

<pre>#!/bin/bash
# Job name:
#SBATCH --job-name=Matlab
#
# Project (change to your own project):
#SBATCH --account=geofag
#
# Wall clock limit:
#SBATCH --time=1000:0:0
#
# Max memory usage per core (MB):
#SBATCH --mem-per-cpu=2000M
#
# Number of tasks (cores):
#SBATCH --ntasks=1
#

## Set up job environment
source /cluster/bin/jobsetup

# Check if we have enough input arguments
if [ $# -lt 2 ]; then
  # No, print a usage message and exit script
  echo -e "\nUsage: sbatch submitscript.sh program dataset\n"
  exit
fi

# Load matlab module
module load matlab

# Make a result directory
RESULT=$SUBMITDIR/Result_`date +%d_%b_%H.%M.%S`
mkdir -p $RESULT

# Save the name of the matlab program and the dataset
PROG=$1
DATA=$2

# Copy files to work directory:
cp $SUBMITDIR/$PROG $SCRATCH
cp $SUBMITDIR/$DATA $SCRATCH

# Change directory to the work directory
cd $SCRATCH

# Start matlab
matlab -nodisplay -nojvm -nodesktop -nosplash < $PROG

# Copy the necessary files from $SCRATCH to the Result directory
cp $SCRATCH/*.txt $RESULT
cp $SCRATCH/*.eps $RESULT

# End of script
</pre>
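Assuming the script above is saved as submitscript.sh in the directory you submit from, a job can be started as shown in the usage message, passing the Matlab program and the dataset as arguments (Matlabprog.m and mydata.mat are just placeholder names for your own files):

<pre># Submit the job with the Matlab program and dataset as arguments
sbatch submitscript.sh Matlabprog.m mydata.mat

# Check the status of your jobs in the queue
squeue -u $USER
</pre>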