eResearch

R SAMPLE SCRIPTS

To submit an R job to the cluster, write a script file similar to the one below. Lines beginning with "##" are comments.

The variable $PBS_O_WORKDIR refers to the directory from which the PBS script file was submitted. Replace the example email address with your own email address, and change the R script file name to the name of the R file you want executed on the cluster.

Note that all "[…]" entries are placeholders that you must define.

Example R PBS Submission Script (/apps/samples/PBS/R.pbs)

###### Select resources #####
#PBS -N [Name of Job]
#PBS -l ncpus=[number of CPUs required, most likely 1]
#PBS -l mem=[amount of memory required]
#PBS -l walltime=[how long the job should run for - you may wish to remove this line]

#### Output File #####
#PBS -o $PBS_O_WORKDIR/[output (standard out) file name]

#### Error File #####
#PBS -e $PBS_O_WORKDIR/[error (standard error) file name]

##### Queue #####
#PBS -q workq

##### Mail Options #####
#PBS -m abe
#PBS -M [your email address]

##### Change to current working directory #####
cd $PBS_O_WORKDIR

##### Execute Program #####
R --vanilla < [Your R file].R > [R output file name]

Real Example

###### Select resources #####
#PBS -N R-Job1
#PBS -l ncpus=1
#PBS -l mem=1g

#### Output File #####
#PBS -o $PBS_O_WORKDIR/R-job1.out

#### Error File #####
#PBS -e $PBS_O_WORKDIR/R-job1.err

##### Queue #####
#PBS -q workq

##### Mail Options #####
#PBS -m abe
#PBS -M j.bell@cqu.edu.au

##### Change to current working directory #####
cd $PBS_O_WORKDIR

##### Execute Program #####
R --vanilla < input.R > results
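
The real example above redirects input.R into R, but the contents of that file are not shown in this guide. As a minimal, hypothetical sketch, any ordinary R script will do; here one is created with a shell heredoc (the calculations inside are examples only):

```shell
#!/bin/sh
# Create a minimal input.R for the job above (hypothetical contents --
# the actual input.R used in the real example is not shown in this guide).
cat > input.R <<'EOF'
## Minimal R script: compute and print simple summaries
x <- 1:10
print(mean(x))
print(sum(x))
EOF
# The PBS script then runs:  R --vanilla < input.R > results
# so everything the script prints ends up in the "results" file.
```
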

Executing the script on the cluster

The Einstein Cluster uses a job scheduler that allows you to schedule and run jobs on the various compute nodes. To submit a job, execute the command:

qsub [pbs_script_file]

To check whether your job is running, queued, or completed, use the command:

qstat -an

Support

eresearch@cqu.edu.au

tasac@cqu.edu.au OR 1300 666 620

Hacky Hour (3pm – 4pm every Tuesday)

High Performance Computing Teams site