The UB-HPC cluster consists of a heterogeneous set of nodes purchased over the years as grants were secured and funds became available. As of June 2022, it also includes the nodes from the industry cluster. The table below lists the node types and approximate quantities. Keep in mind that older hardware is removed from service as it becomes impossible to repair, so the quantities listed here may not be exact.
NOTE: Not all nodes are available in all partitions or to all users. You can get an exact accounting of what is currently available in the UB-HPC cluster by running the `snodes` command on the command line. See the snodes section below for more information.
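For example, a typical invocation looks like the following. The `all` argument and the cluster/partition path shown here are assumptions based on common usage; verify the exact syntax with `snodes` on a login node.

```shell
# Show all nodes in the general-compute partition of the UB-HPC cluster,
# including their core counts, memory, state, and Slurm features.
# (Arguments are an assumption -- check the snodes docs for exact usage.)
snodes all ub-hpc/general-compute
```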
Type of Node | Qty | Cores/Node | Clock Rate | RAM | High Speed Network* | Slurm Features | Local /scratch | CPU/GPU Details |
--- | --- | --- | --- | --- | --- | --- | --- | --- |
Compute (Dell) | 67 | 56 | 2.0GHz | 512GB | Infiniband (HDR) | IB CPU-Gold-6330 INTEL | 875GB | Intel Xeon Gold 6330 (2/node) |
Compute (Dell) | 96 | 40 | 2.10GHz | 192GB | Infiniband (M) | IB CPU-Gold-6230 INTEL NIH | 835GB | Intel Xeon Gold 6230 (2/node) |
Compute (Dell) | 86 | 32 | 2.10GHz | 192GB | OmniPath (OPA) | OPA CPU-Gold-6130 INTEL MRI | 827GB | Intel Xeon Gold 6130 (2/node) |
Compute (Dell) | 34 | 16 | 2.20GHz | 128GB | Infiniband (M) | IB CPU-E5-2660 INTEL | 7TB | Intel Xeon E5-2660 (2/node) |
Compute (Dell) | 372 | 12 | 2.40GHz | 48GB | Infiniband (QL) | IB CPU-E5645 INTEL | 884GB | Intel Xeon E5645 (2/node) |
Compute (Dell) | 128 | 8 | 2.13GHz | 24GB | None | CPU-L5630 INTEL | 268GB | Intel Xeon L5630 (2/node) |
High Memory Compute | 16 | 56 | 2.0GHz | 1TB | Infiniband (HDR) | IB CPU-Gold-6330 INTEL | 750GB | Intel Xeon Gold 6330 (2/node) |
High Memory Compute | 24 | 40 | 2.10GHz | 768GB | Infiniband (M) | IB CPU-Gold-6230 INTEL NIH | 3.5TB | Intel Xeon Gold 6230 (2/node) |
High Memory Compute | 16 | 32 | 2.10GHz | 768GB | OmniPath (OPA) | OPA CPU-Gold-6130 INTEL MRI | 3.5TB | Intel Xeon Gold 6130 (2/node) |
Compute Large Scratch | 1 | 32 | 2.0GHz | 256GB | Infiniband (QL) | IB CPU-X7550 INTEL | 1.3TB | Intel Xeon X7550 (4/node) |
Compute Large Scratch | 8 | 32 | 2.13GHz | 256GB | Infiniband (QL) | IB CPU-E7-4830 INTEL | 3.1TB | Intel Xeon E7-4830 (4/node) |
AMD Compute Large Scratch | 8 | 32 | 2.20GHz | 256GB | Infiniband (QL) | IB CPU-6132HE AMD | 3.1TB | AMD Opteron 6132HE (4/node) |
High Memory Compute | 2 | 32 | 2.13GHz | 512GB | Infiniband (QL) | IB CPU-E7-4830 INTEL | 3.1TB | Intel Xeon E7-4830 (4/node) |
GPU Compute | 16 | 56 | 2.0GHz | 512GB | Infiniband (HDR) | IB CPU-Gold-6330 INTEL | 875GB | Intel Xeon Gold 6330 (2/node) |
GPU Compute | 8 | 40 | 2.1GHz | 192GB | Infiniband (M) | IB CPU-Gold-6230 V100 INTEL NIH | 845GB | Intel Xeon Gold 6230 (2/node), NVidia Tesla V100 16GB (2/node) |
GPU Compute | 16 | 32 | 2.1GHz | 192GB | OmniPath (OPA) | OPA CPU-Gold-6130 INTEL V100 MRI | 827GB | Intel Xeon Gold 6130 (2/node), NVidia Tesla V100 32GB (2/node) |
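The values in the Slurm Features column can be passed to Slurm's `--constraint` option to land a job on a specific node type. A minimal batch-script sketch follows; the partition and QOS names are assumptions, so check them against your allocation before submitting.

```shell
#!/bin/bash
# Sketch of a batch script requesting a specific node type by Slurm feature.
# Feature names come from the "Slurm Features" column above; the cluster,
# partition, and QOS values here are assumptions -- verify your own.
#SBATCH --cluster=ub-hpc
#SBATCH --partition=general-compute
#SBATCH --qos=general-compute
#SBATCH --constraint=CPU-Gold-6330   # request Intel Xeon Gold 6330 nodes
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8
#SBATCH --time=01:00:00
#SBATCH --mem=32G

srun ./my_program
```

Multiple features can be combined with Slurm's AND operator, e.g. `--constraint="IB&CPU-Gold-6230"` to require both Infiniband and the Gold 6230 CPUs.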
Academic Partitions - Detailed Hardware Specs by Node Type
Industry Partition - Hardware Specs
How To Request Specific Hardware When Running Slurm Jobs
Using snodes command to see what's available