5. Retrieve larger data sets from MARS
Revision as of 15:44, 12 March 2013
BATCH JOBS
Data retrieval is done by submitting a shell script.
When running a program on the UNIX system, use the batch system rather than interactive mode: you submit the job with explicit commands so that it runs unattended under Unix.
nohup also lets jobs run unattended on a Unix system, but dedicated batch systems offer more sophisticated job handling.
The batch system currently available on ECgate and HPCF is called LoadLeveler and jobs are submitted with the command llsubmit.
OBS! From June 2013 the batch system on ECgate will change from LoadLeveler to SLURM, and jobs will be submitted with the command sbatch.
This page should be updated with the commands for the new batch system when it is up and running.
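As a rough guide for the SLURM migration mentioned above, a serial LoadLeveler job header translates into sbatch directives. The directive names below are standard sbatch options, but the ECgate-specific defaults (partitions, accounts) are assumptions; check the local documentation once SLURM is in place.

```shell
#!/bin/ksh
# Hypothetical SLURM counterpart of a serial LoadLeveler job header.
# %j in the file names is expanded by SLURM to the job id.
#SBATCH --job-name=request_to_MARS
#SBATCH --output=request_to_MARS.%j.out
#SBATCH --error=request_to_MARS.%j.err
#SBATCH --mail-type=END
#SBATCH --mail-user=<userId>@ecmwf.int
```

Such a script would be submitted with sbatch myscript instead of llsubmit myscript.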
CREATING A SCRIPT
Log in to ECgate.
To retrieve large numbers of files from MARS, create a ksh script in your home directory on ecgate.
(The default shell is either Korn shell (ksh) or C shell (csh).)
For a script example see the script from Sam-Erik:
/nilu/home/sec/ecmwf/ecmwf_starg_all.ksh
In the beginning of the script, set the Batch System keywords:
#@ shell        = /usr/bin/ksh             (specify the shell)
#@ job_type     = serial                   (this is a serial job)
#@ job_name     = request_to_MARS          (name of the job)
#@ initialdir   = /scratch/ms/no/sb9/test/ (path of the initial working directory; OBS: do not use environment variables like $USER in these keywords!)
#@ output       = $(job_name).$(jobid).out (standard output file)
#@ error        = $(job_name).$(jobid).err (error file)
#@ class        = normal                   (priority class of the job; usually "normal")
#@ notification = complete                 (send notification on completion)
#@ notify_user  = <userId>@ecmwf.int       (change to your userId)
#@ account_no   = spnoflex                 (FLEXPART account)
#@ queue                                   (marks the end of the keywords; mandatory)
Then add your request information which might look like this:
retrieve,
  class    = od,                  ("operational archive")
  stream   = oper,                (operational atmospheric model)
  expver   = 1,                   (experiment version; always use 1)
  date     = -1,                  (analysis/forecast base date; date = -n means n days before today)
  time     = 00:00,               (analysis/forecast base time)
  step     = 0/to/72/by/6,
  type     = cf,
  levtype  = pl,
  levelist = 100/150/200/250/300/400/500/700/850/925/1000,
  param    = 129.128/130.128/131.128/132.128/133.128,
  grid     = 0.5/0.5,
  area     = 65/0/55/20,
  target   = "ecmwf_data.grib"    (output file containing the data)
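MARS accepts relative base dates (n days before today), but if you prefer to put an explicit date into the request, it can be computed inside the job script. This is a minimal sketch assuming GNU date (the -d option), which is available on Linux systems; on other Unix variants the syntax differs.

```shell
# Compute an explicit MARS base date (YYYYMMDD) for n days before today.
# Assumes GNU date; "n" here is an illustrative variable name.
n=2
basedate=$(date -u -d "$n days ago" +%Y%m%d)
echo "date = $basedate,"
```

The printed line can then be substituted into the request in place of a relative date.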
A summary of MARS keywords can be found here:
http://www.ecmwf.int/publications/manuals/mars/guide/MARS_keywords.html
To transfer files to zardoz add the following to your ksh script:
ectrans -remote my_ms_association@genericFtp \
        -source ecmwf_data.grib \
        -mailto user@nilu.no \
        -onfailure \
        -remove \
        -verbose
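Putting the pieces together, a complete job script might look like the sketch below. It assumes the mars client on ecgate reads a request from standard input (here via a here-document); the paths, levelist, association name and mail address are illustrative placeholders, not a verified ECgate script.

```shell
#!/bin/ksh
#@ shell      = /usr/bin/ksh
#@ job_type   = serial
#@ job_name   = request_to_MARS
#@ class      = normal
#@ account_no = spnoflex
#@ queue

# Run the MARS request from a here-document (shortened levelist for brevity).
mars <<EOF
retrieve,
  class    = od,
  stream   = oper,
  expver   = 1,
  date     = -1,
  time     = 00:00,
  step     = 0/to/72/by/6,
  type     = cf,
  levtype  = pl,
  levelist = 500/850/1000,
  param    = 129.128/130.128,
  grid     = 0.5/0.5,
  area     = 65/0/55/20,
  target   = "ecmwf_data.grib"
EOF

# Transfer the result to the remote host (placeholders as above).
ectrans -remote my_ms_association@genericFtp \
        -source ecmwf_data.grib \
        -mailto user@nilu.no \
        -onfailure -remove -verbose
```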
Instead of the "retrieve" keyword in the above script you can use other keywords to "read", "compute" or "write" data.
As a rule, you should issue as few retrieval requests as possible, but have each request retrieve as much data as possible.
SUBMIT YOUR JOB
Submit your job as a batch job to LoadLeveler.
To submit your script:
llsubmit myscript
MONITOR YOUR JOB
llq -u <userId>     View where your jobs are in the queue
llq -l <jobId>      Get a detailed description of a job
llq -s <jobId>      Determine why a job is not running
llcancel <jobId>    Cancel your job
See man llq for more options.
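Once the SLURM migration mentioned above has taken place, the rough SLURM counterparts of these commands are as follows. These are standard SLURM commands; any ECgate-specific behaviour is an assumption to be checked against the local documentation.

```shell
squeue -u <userId>          # view your jobs in the queue
scontrol show job <jobId>   # detailed description of a job
scancel <jobId>             # cancel a job
```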