Iridis research computing facility

About the Iridis research computing facility

Iridis, the University’s High Performance Computing system, has dramatically increased the computational performance and reach available to our researchers. Iridis is currently in its fifth generation and remains one of the largest computational facilities in the UK. In 2017, Iridis 5 joined the world’s top 500 supercomputers and is over four times more powerful than its predecessor.

In 2024, we welcomed the sixth generation of Iridis (Iridis 6), adding over 26,000 CPU cores. This substantial upgrade effectively doubles the computational capacity of the entire Iridis facility. Iridis 6 replaced Iridis 4, which had just over 12,000 CPU cores, and runs alongside Iridis 5, forming a versatile computing ecosystem. Additionally, the Iridis computing facility houses Iridis X, our high-performance GPU cluster, with 26 x 80GB NVIDIA A100 graphics cards and 5,760 AMD CPU cores for batch computing.

Highlights

  • Iridis 5 comprises 25,000+ processor cores and 74 GPU cards, as well as 2.2PB of storage using the IBM Spectrum Scale file system
  • Iridis 6 comprises 26,000+ AMD CPU cores, alongside high-memory and login nodes with up to 3TB of memory and 15TB of local storage each
  • Iridis X features a total of 5,760 AMD CPU cores and 26 NVIDIA A100 graphics cards, with two of the A100 cards funded by the School of Mathematical Sciences. Additionally, it includes 16 NVIDIA H100 SXM GPUs and 20 NVIDIA A100 SXM GPUs, which are owned by the School of Electronics and Computer Science and the Optoelectronics Research Centre, but can also be scavenged by other Iridis users
  • dedicated nodes for visualisation software applications
  • management of private research facilities including the School of Engineering’s deep learning computing cluster
  • dedicated research computing systems engineers providing user support and training, with an inclusive HPC facility supporting both research and teaching activities at the University
  • high performance InfiniBand network infrastructure for high-speed data transfer

Access to Iridis

If you are a researcher or postgraduate student at the University of Southampton, and think that you have a good case for using Iridis, please complete the online application form.

Undergraduate and MSc students can access the Lyceum service which runs on the Iridis cluster. Project/Course tutors should fill out the Lyceum Account Application Form to request access for students.

Technical specification

Iridis 5

Compute nodes

  • 464 Intel compute nodes with dual 2.0 GHz Intel Xeon Gold 6138 processors
  • each Intel compute node has 40 CPU cores and 192 GB of DDR4 memory, while the AMD compute nodes have 64 CPU cores and 256 GB of DDR4 memory

Graphics cards

  • 20 NVIDIA Tesla V100 graphics cards, each with 16GB of VRAM, spread across 10 nodes (node specification: 40 CPU cores with 192 GB of DDR4 memory)
  • 40 NVIDIA GeForce GTX 1080 Ti graphics cards, each with 11GB of VRAM, spread across 10 nodes (node specification: 28 CPU cores with 128GB of DDR4 memory)

High Memory nodes

  • 16 high-memory nodes, each with 64 cores, 1.5TB of RAM and 20TB of SAS HDD.

Visualisation nodes, login nodes and more

  • 2 data visualisation nodes with 32 usable cores, 384 GB of RAM and an NVIDIA M60 GPU. These run both Windows and Linux VMs
  • 3 Intel login nodes, each with 40 CPU cores and 384 GB of memory
  • 1 AMD login node with 64 CPU cores and 512 GB of memory
  • in total, more than 20,000 processor cores providing 1,305 TFLOPS of peak performance (a minimal single-node job sketch follows this list)
  • InfiniBand HDR100 Interconnect for high-speed data transfer
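
As an illustration only (not part of the facility specification), and assuming a standard Python environment is available on the compute nodes, a single-node job on Iridis 5 might spread independent tasks across the cores of one node, as in the sketch below. The task function and task count are placeholders.

    # Minimal sketch: spread independent tasks across the cores of a single
    # compute node (an Iridis 5 Intel node exposes 40 CPU cores). The task
    # function is a placeholder for real per-task work.
    import os
    import random
    from multiprocessing import Pool

    def run_task(seed: int) -> float:
        # Placeholder computation, e.g. one Monte Carlo replicate.
        rng = random.Random(seed)
        return sum(rng.random() for _ in range(100_000))

    if __name__ == "__main__":
        # Use only the cores actually allocated to this job (Linux-specific call).
        n_cores = len(os.sched_getaffinity(0))
        with Pool(processes=n_cores) as pool:
            results = pool.map(run_task, range(200))
        print(f"{len(results)} tasks on {n_cores} cores, "
              f"mean = {sum(results) / len(results):.2f}")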

Iridis 6

  • 25,000+ AMD CPU cores for batch computing (a minimal multi-node job sketch follows this list)
  • 4 high-memory nodes, each with 3TB of memory, 6.6TB of local storage and 192 AMD CPU cores
  • 3 login nodes, each with 3TB of memory, 15TB of local storage and 192 AMD CPU cores
  • InfiniBand HDR100 Interconnect for high-speed data transfer
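
For illustration, and assuming an MPI library and the mpi4py package are available (the installed software stack is not listed on this page), a multi-node batch job on a CPU partition such as Iridis 6 typically follows the pattern sketched below: each MPI rank processes a slice of the work and the partial results are combined on rank 0.

    # Minimal multi-node sketch, assuming MPI and mpi4py are available.
    # It runs with however many ranks the batch scheduler launches,
    # e.g. via "mpirun python this_script.py" inside a job script.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Each rank takes every size-th item of the shared work range.
    n_items = 1_000_000
    local_sum = sum(float(i) ** 0.5 for i in range(rank, n_items, size))

    # Combine the partial sums on rank 0 and report.
    total = comm.reduce(local_sum, op=MPI.SUM, root=0)
    if rank == 0:
        print(f"{size} ranks, total = {total:.1f}")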

Iridis X

  • 5,760 AMD CPU cores in total for batch computing
  • 26 x 80GB NVIDIA A100 graphics cards in total, including 2 A100 cards funded by the School of Mathematical Sciences (a minimal GPU job sketch follows this list)
  • each A100 node contains 500GB of memory and 48 CPU cores
  • dedicated partition for the School of Electronics and Computer Science and the Optoelectronics Research Centre (funded by the Wolfson Foundation), with specifications:
    • 16x 80GB H100 SXM NVIDIA GPUs
    • 20x 80GB A100 SXM NVIDIA GPUs
    • this hardware can be scavenged by general users
  • 1 login node with 4 x 24GB NVIDIA L4 PCIe GPUs, 1TB of memory and 64 CPU cores
  • 1 login node with 64 AMD CPU cores and 512GB of memory
  • InfiniBand HDR100 Interconnect for high-speed data transfer
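
As a sketch only, and assuming a Python environment with PyTorch is available on the GPU nodes (not confirmed by this page), a job on the Iridis X A100 or H100 nodes would typically detect the GPUs made visible to it and place its computation on them:

    # Minimal GPU sketch, assuming PyTorch is installed. It lists the visible
    # GPUs (e.g. A100 or H100 cards on Iridis X) and runs a matrix multiply
    # on the first one, falling back to the CPU if none are visible.
    import torch

    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            print(f"GPU {i}: {torch.cuda.get_device_name(i)}")
        device = torch.device("cuda:0")
    else:
        device = torch.device("cpu")  # e.g. when run on a login node

    a = torch.randn(4096, 4096, device=device)
    b = torch.randn(4096, 4096, device=device)
    print(f"result norm on {device}: {(a @ b).norm().item():.2f}")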

Contact us

The University provides administrative, training and software engineering support to help researchers unlock the full potential of our high performance computing clusters.

  • University of Southampton Research Data Centre
  • HPC Administrative team
  • University of Southampton Research Software Group
