Submit Matlab jobs on Abel

From mn/geo/geoit

This is an example of a Matlab submit script for use on the Abel cluster. Abel usage is documented at UiO's HPC pages.


For normal off-line, non-parallel Matlab runs on any Linux server, you can simply run, e.g.:

nohup matlab -nodisplay -nojvm -nodesktop -nosplash < Matlabprog.m  > outdata.txt &
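The `nohup ... &` pattern can be tried out with a harmless stand-in command before committing a long Matlab run to it (here `sleep` plays the role of the `matlab ... < Matlabprog.m` invocation; the file names are illustrative):

```shell
# nohup detaches the command from the terminal so it survives logout;
# the trailing & puts it in the background ("sleep 2" stands in for matlab)
nohup sleep 2 > outdata.txt 2>&1 &
PID=$!

# The job keeps running in the background; check on it, then wait for it
ps -p "$PID" > /dev/null && echo "job $PID is running"
wait "$PID" && echo "job finished"
```

While the real Matlab job runs, `tail -f outdata.txt` shows its output as it is written.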

For large jobs, the following script should set you up on the Abel cluster:

#!/bin/bash
# Job name:
#SBATCH --job-name=Matlab
#
# Project (change to your own project account):
#SBATCH --account=geofag
#
# Wall clock limit:
#SBATCH --time=1000:0:0
#
# Max memory usage per core (MB):
#SBATCH --mem-per-cpu=2000M
#
# Number of tasks (cores):
#SBATCH --ntasks=1
#

## Setup job environment
source /cluster/bin/jobsetup

# Check if we have enough input arguments
if [ $# -lt 2 ]; then
  # No, print a usage message and exit the script with an error status
  echo -e "\nUsage: sbatch submitscript.sh program dataset\n"
  exit 1
fi

#Load matlab module

module load matlab

#Make a result directory

RESULT=$SUBMITDIR/Result_`date +%d_%b_%H.%M.%S`
mkdir -p $RESULT

#Save the name of the matlab program and the dataset

PROG=$1
DATA=$2
# Copy files to work directory:

cp $SUBMITDIR/$PROG $SCRATCH
cp $SUBMITDIR/$DATA $SCRATCH

#Change directory to the work directory

cd $SCRATCH

#Start matlab

matlab -nodisplay -nojvm -nodesktop -nosplash < $PROG

#Copy the necessary files from $SCRATCH to the Result directory

cp $SCRATCH/*.txt $RESULT
cp $SCRATCH/*.eps $RESULT

#End of script
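The copy-to-scratch and timestamped result-directory steps in the script can be exercised locally with stand-in directories. On Abel, $SUBMITDIR and $SCRATCH are set for you by jobsetup; below they are plain temporary directories, and the program and dataset names are made up for illustration:

```shell
# Stand-ins for the variables jobsetup provides on Abel
SUBMITDIR=$(mktemp -d)
SCRATCH=$(mktemp -d)

# Timestamped result directory, as in the submit script
RESULT=$SUBMITDIR/Result_$(date +%d_%b_%H.%M.%S)
mkdir -p "$RESULT"

# A fake Matlab program, copied to the work area as the script does
echo "disp('hello')" > "$SUBMITDIR/prog.m"
cp "$SUBMITDIR/prog.m" "$SCRATCH"

# After the job has run, text output is copied back to the result directory
echo "result" > "$SCRATCH/out.txt"
cp "$SCRATCH"/*.txt "$RESULT"
ls "$RESULT"
```

Once the script is saved (e.g. as submitscript.sh), submit it with sbatch submitscript.sh Matlabprog.m dataset.txt and monitor the job with squeue -u $USER.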