High Performance and Parallel Computing Office of Information Technology

Easley Cluster (2020)

Dell PowerEdge HPC Cluster

15,000 Cores | 70 TB RAM | 4 PB Storage Space | ~1000 TFlops


Easley User's Guide

Cluster Components

HEAD AND LOGIN NODES

The head and login nodes are Dell PowerEdge R640 servers, with the following specifications:

  • 2x Intel Xeon Gold 6248R "Cascade Lake" CPUs (24 cores/CPU)
  • 192GB of memory
  • 1TB local SSD

Operating System: CentOS 7.9

COMPUTE NODES

393 Compute Nodes

There are multiple classes of compute nodes in this cluster:

  • 151 "Standard" nodes. Each node has 192GB of memory and 2x Intel Xeon Gold 6248R @ 3.0GHz CPUs (24 cores/CPU)
  • 1 "Super" node with 2,016GB RAM and 4x Intel Xeon E7-8890 @ 2.5GHz CPUs (18 cores/CPU)
  • 23 Type I Large Memory nodes with 384GB RAM
  • 11 Type II Large Memory nodes with 768GB RAM
  • 9 Type I GPU nodes with 2x NVIDIA Tesla T4 GPUs
  • 2 Type II GPU nodes with 4x NVIDIA Tesla T4 GPUs
  • 13 AMD nodes with 2x AMD EPYC 7662 CPUs (64 cores/CPU)
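The per-class core counts above can be tallied with a quick sketch. The CPU models in the large-memory and GPU nodes are not listed in this guide, so the sketch assumes they use the same 2x 24-core parts as the Standard nodes; the Nova nodes are excluded because their core counts vary per node.

```python
# Tally CPU cores across the Easley node classes listed above.
# Assumption: large-memory and GPU nodes use the same 2x 24-core
# CPUs as the Standard nodes (their CPUs are not specified here).
node_classes = {
    "standard":     (151, 2 * 24),
    "super":        (1,   4 * 18),
    "large_mem_i":  (23,  2 * 24),  # assumed CPU config
    "large_mem_ii": (11,  2 * 24),  # assumed CPU config
    "gpu_i":        (9,   2 * 24),  # assumed CPU config
    "gpu_ii":       (2,   2 * 24),  # assumed CPU config
    "amd":          (13,  2 * 64),
}

total_cores = sum(count * cores_per_node
                  for count, cores_per_node in node_classes.values())
print(total_cores)  # 11144 cores before counting the Nova nodes
```

The remainder of the headline 15,000-core figure would come from the 183 Nova nodes, whose per-node core counts range from 16 to 32.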

In addition, 183 "Nova" nodes of the Lenovo System X series were migrated from AUHPC's prior Hopper cluster. They feature a range of hardware configurations:

  • CPU: Intel Xeon E5/E7 series (2.0-3.2GHz, 16-32 cores)
  • Memory: 128GB-1TB
  • GPU: 2 nodes with 2x NVIDIA Tesla K80 cards
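With hardware this varied, it can be useful to confirm what a given node actually provides once you have a session on it. A minimal sketch using standard Linux tools (nvidia-smi is only present on GPU nodes, hence the guard):

```shell
#!/bin/sh
# CPU model and logical core count of the current node
grep -m1 'model name' /proc/cpuinfo
nproc

# Total memory on the node
free -h | grep Mem

# List GPUs only if the NVIDIA driver is installed (GPU nodes)
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi -L
else
    echo "no NVIDIA driver on this node"
fi
```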

Operating System: CentOS 7.9

STORAGE NODES

4x Dell PowerEdge R440 servers:

  • 2x management nodes
  • 2x storage nodes for GPFS
  • Each node: Intel Xeon Gold 5220 CPU @ 2.2GHz, 192GB RAM

Dell SAN

Dell ME4084 Storage Controllers:

  • 4PB of disk presented to all nodes via GPFS
  • All nodes are InfiniBand EDR connected

HPC LINKS

For more information, email hpcadmin@auburn.edu

Last Updated: October 24th, 2025