Materials from past workshops can be found below. Please keep in mind that some of the information about available computing resources and cluster access may be outdated. For current information on how to access the cluster, please refer to our FAQ page or our Documentation section.
Current offerings: See our Spring Training page for recent training sessions and materials.
Parallel Computing: Parallel Computing using OpenMP (slides)
Next Steps: Running Biology Workflows on Odyssey (slides)
Parallel Computing: Introduction to Parallel Computing (slides)
Next Steps: Troubleshooting Jobs on Odyssey (slides)
Tips@12: Running ImageJ macros and jobs on Odyssey (GitHub repo)
Tips on the 10s: Installing Python/Perl Modules and R Packages (slides)
RC Software Carpentry Workshop: Nov '14 materials
Introduction to Odyssey and RC Services (ongoing updates)
Below is a list of high-performance computing resources from outside Harvard. If you know of other resources that might be helpful to the Research Computing community, please send links and suggestions to our communications coordinator, Gabrielle Naglieri.
MATLAB Demos: MATLAB demonstrations and examples from MathWorks
High Performance Computing with CUDA: CUDA tutorial by NVIDIA
GPU Computing Webinars: Webinars by NVIDIA
LLNL HPC Training: Workshops and training resources offered by Lawrence Livermore National Laboratory
SDSC HPC Training: Workshops and training resources offered by the San Diego Supercomputer Center
NERSC Training Materials: Classes, workshops, and training materials offered by the National Energy Research Scientific Computing Center
TACC Training Materials: Classes, workshops, and training materials offered by the Texas Advanced Computing Center
OU Training Materials: Classes, workshops, and training materials offered by the University of Oklahoma
Rutgers University Training: Video tutorials on how to use MATLAB
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. Permissions beyond the scope of this license may be available.