We are pleased to announce that eResearch will be running an introduction to the Griffith High Performance Computer (HPC) workshop.
At this stage we plan to run this two-part workshop as face-to-face sessions at the Nathan campus: 6 hours split over 2 days, i.e. 3 hours each day. In the unlikely event of new COVID restrictions, we may have to run the workshop online.
This workshop will introduce the fundamentals needed to use Griffith's HPC, including basic Bash commands and PBS scheduler syntax. By the end of these sessions, you will know how to upload your data and R or Python scripts and execute them on the HPC.
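To give a flavour of what the workshop covers, below is a minimal sketch of a PBS job script of the kind you will learn to write. The job name, resource requests, module name, and script name (`my_analysis.py`) are illustrative assumptions only; the actual values and module names on the Griffith HPC may differ and are covered in the sessions.

```shell
#!/bin/bash
# Hypothetical PBS job script (names and resources are placeholders)
#PBS -N example_job                 # job name shown in the queue
#PBS -l select=1:ncpus=2:mem=4gb   # request 1 node, 2 CPUs, 4 GB memory
#PBS -l walltime=00:10:00          # maximum run time of 10 minutes

# Move to the directory the job was submitted from
# (falls back to the current directory when run outside PBS)
cd "${PBS_O_WORKDIR:-.}"

# module load python    # load software with the module system (name is an assumption)
echo "Running analysis"
# python my_analysis.py # run your uploaded script
```

You would submit such a script to the scheduler with `qsub`, and the HPC runs it when the requested resources become free.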
Are you running an analysis on your computer that is taking far too long?
Is your analysis using all of your computer's memory and storage?
Has someone recommended that you use an HPC or the cloud to run your analysis?
Yes? Then you have come to the right place! Griffith University has an in-house High Performance Computing (HPC) environment available to all Griffith researchers and HDR students.
HPCs are supercomputers that can run analyses in a fraction of the time they would take on your personal computer. The first HPC was developed in 1976; today they are integral pieces of infrastructure found at most universities and throughout industry and government. This tutorial is specific to the Griffith HPC; however, many other HPCs work in a very similar fashion.
Please note that this workshop is capped at 20 participants.