Submit R jobs on Abel

From mn/geo/geoit

Revision as of 11:01, 27 April 2015

This is an example of an R submit script for use on the Abel cluster. Abel usage is documented at UiO's HPC pages.


For a normal off-line, non-parallel R run on any Linux server, you can simply do, e.g.:

<pre>
module load R
R CMD BATCH R_program.r
</pre>
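Here <code>R_program.r</code> is a placeholder for whatever batch-safe R script you want to run. A minimal hypothetical example (when run through <code>R CMD BATCH</code>, its printed output lands in <code>R_program.r.Rout</code>):

<pre>
# R_program.r -- hypothetical example batch script
x <- rnorm(1000)              # 1000 standard-normal draws
cat("mean:", mean(x), "\n")   # goes to R_program.r.Rout, not the terminal
</pre>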

For large jobs, this script should set you up OK at the Abel cluster:

<pre>
#!/bin/bash
# Job name:
#SBATCH --job-name=R
#
# Project (change to your own project account):
#SBATCH --account=geofag
#
# Wall clock limit:
#SBATCH --time=1000:0:0
#
# Max memory usage per core (MB):
#SBATCH --mem-per-cpu=2000M
#
# Number of cores:
#SBATCH --cpus-per-task=12
#

## Setup job environment
source /cluster/bin/jobsetup

# Load R module
module load R

# Set variable with number of processors and load MPI
export OMP_NUM_THREADS=12
module load R openmpi.gnu

mpirun -np 1 R CMD BATCH --no-save --no-restore R_program.r

# End of script
</pre>
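Once the script is saved to a file (the name below is just an example), it is handed to the SLURM queue with <code>sbatch</code>; a quick sketch of the submit-and-monitor cycle:

<pre>
# Submit the job script (filename is an example):
sbatch R_job.sh

# List your queued and running jobs:
squeue -u $USER

# Inspect R's output after the job finishes:
less R_program.r.Rout
</pre>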