Submit R jobs on Abel
[[Category:Tools]]
[[Category:Software]]
[[Category:R]]
[[Category:Abel]]
Latest revision as of 12:48, 27 April 2015
This is an example of an R submit script for use on the Abel cluster. Abel usage is documented at [http://www.uio.no/english/services/it/research/hpc/abel/ UiO's HPC pages].
For normal offline, non-parallel R runs on any Linux server, you can simply do e.g.:
<pre>module load R
R CMD BATCH R_program.r
</pre>
For large jobs, this script should set you up OK at the Abel cluster:
<pre>#!/bin/bash
# Job name:
#SBATCH --job-name=R
#
# Project (change to your own project account):
#SBATCH --account=geofag
#
# Wall clock limit:
#SBATCH --time=1000:0:0
#
# Max memory usage per core (MB):
#SBATCH --mem-per-cpu=2000M
#
# Number of tasks and cores per task:
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=12
#
## Set up job environment
source /cluster/bin/jobsetup

# Load the R and MPI modules
module load R openmpi.gnu

# Set the number of OpenMP threads to match the cores requested above
export OMP_NUM_THREADS=12

mpirun -np 1 R CMD BATCH --no-save --no-restore R_program.r
# End of script
</pre>
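To submit the job, save the script above to a file and pass it to <code>sbatch</code> on an Abel login node. A minimal sketch of the workflow, assuming the script is saved as <code>R_job.sh</code> (the filename is an illustration, not fixed by anything above):

```shell
#!/bin/sh
# Write the job script to a file (abbreviated to the SLURM directives
# and the run commands from the script above):
cat > R_job.sh <<'EOF'
#!/bin/bash
#SBATCH --job-name=R
#SBATCH --account=geofag
#SBATCH --time=1000:0:0
#SBATCH --mem-per-cpu=2000M
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=12
source /cluster/bin/jobsetup
module load R openmpi.gnu
export OMP_NUM_THREADS=12
mpirun -np 1 R CMD BATCH --no-save --no-restore R_program.r
EOF

# On the cluster you would then submit and monitor it:
#   sbatch R_job.sh        # prints "Submitted batch job <jobid>"
#   squeue -u $USER        # list your queued and running jobs

# Local sanity check: the script carries 6 SLURM directives
grep -c '^#SBATCH' R_job.sh   # prints 6
```

After the job finishes, <code>R CMD BATCH</code> leaves the console output in <code>R_program.r.Rout</code> in the submission directory.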