Computational Modelling Group

Iridis

The University of Southampton's supercomputing facility is called Iridis. The University currently runs two Iridis supercomputers simultaneously, making it one of the top supercomputing sites in the UK. The facility is open to research students and members of academic staff from any Faculty who have compute requirements substantially greater than a standard PC can provide. In addition, we have a dedicated Lyceum cluster intended mainly for undergraduate and MSc project work.

Iridis 4

Iridis 4 hardware

The current Iridis 4 is our fourth-generation cluster. Here are some of its headline achievements:

According to the November 2012 issue of the TOP500 supercomputer list, it is:

  • the most powerful academic supercomputer in England;
  • the second-largest academic computational facility in the UK, behind the National Facility;
  • among the top 15 academic computational facilities in Europe;
  • among the top 30 academic computational facilities in the world.

Technical specifications for Iridis4

  • 12,200 cores (250 TFlops);
  • 16 × 2.6 GHz cores per node;
  • 4 GB of memory per core;
  • 4 high-memory nodes with 256 GB of RAM;
  • 24 Intel Xeon Phi accelerators (25 TFlops);
  • 1.04 PB of storage with a parallel file system;
  • InfiniBand network for interprocess communication.
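
The quoted aggregate of 250 TFlops is consistent with a back-of-the-envelope theoretical peak from the figures above. This sketch assumes 8 double-precision FLOPs per cycle per core (typical for AVX-capable Intel Xeons of that generation); that per-cycle figure is an assumption, not stated in the original specifications.

```python
# Rough theoretical-peak estimate for Iridis 4 (a sketch, not official figures).
cores = 12_200         # total cores, from the spec list above
clock_ghz = 2.6        # per-core clock speed
flops_per_cycle = 8    # ASSUMED: 8 double-precision FLOPs/cycle (AVX)

peak_tflops = cores * clock_ghz * flops_per_cycle / 1000
print(f"Theoretical peak: {peak_tflops:.1f} TFlops")
# → Theoretical peak: 253.8 TFlops
```

The result (~254 TFlops) lines up with the quoted 250 TFlops; the small gap is expected, since vendor-quoted peaks often exclude a few reserved or service nodes.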

See also the press release of 15 October 2013 and a video about Iridis 4.

Recommended hashtag: #iridis4

Iridis 3

Iridis 3 hardware

Iridis 3 is the previous generation of Iridis, launched in 2010, when it was ranked 74th in the world on the TOP500 list. In 2012 Iridis 3 underwent a major upgrade, as a result of which its number of processor cores increased by nearly 50%.

Technical specifications for Iridis 3

  • more than 11,760 processor cores providing over 105 TFlops peak;
  • more than 1,000 nodes with a total of 22.4 TB of RAM;
  • InfiniBand network;
  • IBM General Parallel File System (GPFS), giving around 240 TB of usable storage;
  • two 6-core 2.4 GHz Intel Westmere processors, for a total of 12 processor cores per node;
  • approximately 22 GB of usable memory per node (the remaining memory stores the OS, as the nodes are stateless);
  • 32 high-memory compute nodes, each with two 4-core 2.27 GHz Nehalem processors and 45 GB of memory available to users;
  • 15 GPU nodes, each with two NVIDIA Tesla 20-series (M2050) GPUs (15 TFlops).

Further Information

The main source of information on the HPC service is the CMG community pages, accessible to all University staff and postgraduate students. As well as documentation on how to access and use the service, they provide information on training courses, background information on the facility, user forums, and links to further sources of information.

Research outcome

You can get a flavour of the kinds of research topics enabled by Iridis from this list of scientific projects.