Easley User’s Guide
Easley Cluster Quick Start Guide
Auburn University HPC Documentation
Easley User’s Guide
Introduction
About the Easley Cluster
Easley Namesake
About this Manual
Acceptable Use
Citations and Acknowledgements
System Overview
Accessing Easley
Request an Account
Connect to Easley
Common Issues Connecting to Easley
Locations & Resources
Login Node
Compute Nodes
File Storage
Data Transfer
Job Scheduling & Resource Allocation
Job Submission
Job-Related Commands
Slurm Job States
Partitions
Monitor Jobs
Monitor Resources
Testing
Job Submission Examples
Job Arrays
Software
Finding Available Software
New Software
Set the Environment
GPUs
GPU Quick Start
Easley GPU Devices
Scheduling GPUs
GPU Overview
Easley Cluster Training: Fundamentals
Getting Started & Basic Linux Commands
Research Software
Job Scheduling
Job Submission
Next Steps
How to Get Help
Programming Environments
R Programming Environment
Python on AU HPC Systems
Python Virtual Environments
Python Fundamentals Training
Introduction
Listing Available Packages
Python or Anaconda?
Virtual Environments and Installing Packages Locally
Python Concurrency and Parallelism
Multithreading with Python
Multiprocessing with Python
Job Submission
Additional Services
Data Transfer
SCP (Linux and Mac Users)
WinSCP (Windows Users)
Globus
GPU Workloads
Introduction
GPU Modules & Locations
Basic CUDA C++ Example
Using CUDA with PyTorch
Installing PyTorch
torch.cuda
Batch Job Submission
MPI Workloads
Containers and HPC
Singularity or Docker?
Benefits of Using Containers
Popular Container Registries
Pulling Containers
Run and exec
Job Submission
Building an Image
Easley Cluster Quick Start Guide
Connect to the Easley Cluster
Running Batch Jobs
Checking Job Status
Viewing Available Partitions (Queues)
Viewing Available Software
Loading Software