Submitting jobs from RStudio to the Scientific Computing Linux Clusters

When working with RStudio on the Scientific Computing (SciC) Linux Clusters, computations that require significant memory and/or CPU time should be sent to the cluster rather than run inside your R session. This prevents your R session from running out of memory and the RStudio server from being overloaded. Follow this guide to submit jobs to the cluster and pull the results back directly into an R session.

Note: This example is based on the experience of a SciC Linux Clusters user; more details are available on their blog.

Prepare the script template file

To submit batch R jobs to the SciC Linux Clusters from inside an R environment, you can use the following template. Make a note of the full path to the file so that you can load it in R later. Here the file is called "lsfTemplate.tmpl". Replace "module load R/3.2.2" with the R module that you are actually using.

## Default resources can be set in your .BatchJobs.R by defining the variable
## 'default.resources' as a named list.
## remove everything in [] if your cluster does not support array jobs
#BSUB -J <%= job.name %>[1-<%= arrayjobs %>] # name of the job / array jobs
#BSUB -o <%= log.file %> # output is sent to logfile, stdout + stderr by default
#BSUB -q <%= resources$queue %> # Job queue
##BSUB -W <%= resources$walltime %> # Walltime in minutes
##BSUB -M <%= resources$memory %> # Memory requirements in Kbytes
# we merge R output with stdout from LSF, which gets then logged via -o option
module load R/3.2.2
Rscript --no-save --no-restore --verbose <%= rscript %> /dev/stdout

Obtain an R session

You can work within RStudio on the SciC Linux Clusters,

OR you can start an interactive session on a compute node and run command-line R within that session:

 bsub -Is -R 'rusage[mem=10000]' -n 4 /bin/bash
...
R

Submitting tasks from the R session

In the R session, start with the first function, which reads in the LSF template and sets up the scheduler configuration in the environment.

cluster.functions <- makeClusterFunctionsLSF("/full_path_to_template/lsfTemplate.tmpl")
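
A note on setup: the functions used throughout this guide (makeClusterFunctionsLSF, makeRegistry, batchMap, submitJobs) come from the BatchJobs R package, so it needs to be loaded first. A minimal sketch, assuming BatchJobs is installed in the R module you loaded; depending on your BatchJobs version and configuration files, you may also need to register the cluster functions explicitly with setConfig:

library(BatchJobs)                                   # provides the cluster/registry functions used below
setConfig(cluster.functions=cluster.functions)       # register the LSF cluster functions for this session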

Next, create a registry, for which you need three pieces of information:

  1. id: a project ID rather than a job ID
  2. file.dir: the output directory
  3. src.dirs: any .R files in this directory will be "sourced" in the bsub job

reg <- makeRegistry(id="test_boot_reg", seed=123,
                    file.dir="/path_to_output_dir/",
                    src.dirs="path_to_source_dir")
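
One practical aside (not covered in the original write-up): if you come back later in a fresh R session, you do not need to rebuild the registry; BatchJobs can re-attach to the one already on disk, along these lines:

reg <- loadRegistry(file.dir="/path_to_output_dir/")   # re-attach to an existing registry on disk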

The next function is batchMap, which maps your function over a vector and defines one job per element. You need to pass it the registry, the function that you want to run, a vector to split over, and any additional arguments that you want to pass to the function, for example:

batchMap(reg,bootRun,seq(from=10,by=120,length.out=10),
more.args=list(indat=thresh.wide.dat,nBoot=nBoot, nSamp=nSamp, nCol=nCol))

You can wrap almost anything within the function, but you need to specify a vector or list after the function.
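
For orientation, bootRun here is a user-defined function that is not shown in this article. The sketch below is purely hypothetical and only illustrates the shape such a function can take, assuming the vector element is used as a random seed and indat is a data frame; the resampling logic and statistic are illustrative, not the original author's code:

bootRun <- function(seed, indat, nBoot, nSamp, nCol) {
  set.seed(seed)                                        # the vector element supplied by batchMap
  out <- numeric(nBoot)
  for (i in seq_len(nBoot)) {
    rows <- sample(nrow(indat), nSamp, replace = TRUE)  # resample nSamp rows with replacement
    out[i] <- mean(indat[rows, nCol])                   # illustrative statistic: mean of column nCol
  }
  out                                                   # the return value is what loadResult() retrieves later
}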

Suppose you want to run one large job that bootstraps a dataset 10,000 times with a for loop, and to submit it as a single job to the cluster with batchMap. If the dataset is used as the main argument it gets vectorized, so the first argument should be a dummy variable to which you pass a single value:

exampleBoot <- function(a, indat, nBoot) {
  # bootstrap using a for loop here and return the result
}

and then call batchMap like this:

batchMap(reg,exampleBoot,0,more.args=list(indat=thresh.wide.dat,nBoot=nBoot))

The call above creates a single job for the cluster. To see the job IDs, use getJobIds(reg).

Finally, submit the jobs using the command below; any additional bsub options can be passed using the resources argument.

submitJobs(reg, resources=list(queue="medium"), progressbar=FALSE, max.retries=0)

NOTE: The value of max.retries has to be changed from its default to 0, as the default generates an error.
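
The template above also contains commented-out placeholders for resources$walltime and resources$memory. If you enable those lines (by removing one of the leading # characters), the corresponding values can be supplied through the same resources list; the values below are purely illustrative (walltime in minutes, memory in KB, per the template comments):

submitJobs(reg,
           resources=list(queue="medium", walltime=120, memory=4000000),
           progressbar=FALSE, max.retries=0)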

Suppose instead you want to submit 10 jobs of 100 bootstraps each, which is exactly the first batchMap example above. You simply provide a sequence for the vector, and batchMap automatically creates a job for each element of that sequence. Again this uses the dummy variable, here passed a sequence of seeds, and the number of bootstraps per job is passed through the additional arguments:

batchMap(reg,bootRun,seq(from=10,by=120,length.out=10),more.args=list(indat=thresh.wide.dat,nBoot=nBoot, nSamp=nSamp, nCol=nCol))
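
The seq() call simply produces ten distinct numbers, one seed per job; you can check what it expands to directly in the R console:

seq(from=10, by=120, length.out=10)
# [1]   10  130  250  370  490  610  730  850  970 1090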

The tremendous advantage of this whole process is that you remain in your interactive R session the entire time.

Other useful functions are showStatus, to check on the progress of the jobs, and loadResult, to pull the result of a finished job back into the session:

showStatus(reg)
res1 <- loadResult(reg,1)
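
If you want to block until everything has finished and then collect all results at once, BatchJobs also provides functions along these lines (check the documentation of the version installed on the cluster):

waitForJobs(reg)          # wait until all submitted jobs have finished
res <- loadResults(reg)   # list of results, one element per job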
