SPH With Inter-dependent Fine-grained Tasking

Public Information

What is SWIFT?

SWIFT is a hydrodynamics and gravity code for astrophysics and cosmology. What does that even mean? It is a computer program designed to run on supercomputers that simulates the forces acting on matter due to two main processes: gravity and hydrodynamics (the forces that arise within fluids, such as pressure and viscosity). The creation and evolution of stars and black holes are also modelled, together with the effects they have on their surroundings. This turns out to be quite a complicated problem, as we can’t build computers large enough to simulate everything down to the level of individual atoms. This means we need to re-think the equations that describe the matter components and how they interact with each other. In practice, we must solve the equations that describe these problems numerically, which requires a lot of computing power and fast computer codes.

We use SWIFT to run simulations of astrophysical objects, such as planets, galaxies, or even the whole universe. We do this to test theories about what the universe is made of and how it evolved from the Big Bang up to the present day!

Cosmological simulations

The formation of the Perseus cluster of galaxies recreated within the SIBELIUS-DARK simulations.

Planetary simulations

A simulation of a plausible formation scenario for the Moon as the result of an impact during the early days of our solar system.

Why create SWIFT?

We created SWIFT for a number of reasons. The primary reason is that we want to be able to simulate a whole universe! This has been done successfully before (see the EAGLE Project for more details), but that simulation used software that is not tailored to the newest supercomputers, and it took almost 50 days on a very large computer to complete. SWIFT aims to remedy that by parallelising the problem in a different way, by using better algorithms, and by having a more modular structure than other codes, making it easier for users to pick and choose which physical models they want to include in their simulations. This also lets us study very different topics, such as the giant impacts of planets colliding in the early days of the solar system.

Supercomputers are built not as one huge, super-fast ‘computer’, but rather as lots of regular computers (only a tiny bit better than what is available at home!) connected together by high-speed networks. Therefore, the way to speed up your code might not necessarily be to make it ‘run faster’ on a single machine, but rather to enable those machines to talk to each other more efficiently. This is how SWIFT differs from other astrophysics codes used for a similar purpose: the focus is on distributing the work to be done (the equations to be solved) in the best possible way across all the small computers that make up a supercomputer.

Traditionally, each ‘node’ (computer) in the ‘cluster’ (supercomputer) runs the exact same code at the exact same time, and at the end of each bit of the problem they all talk to each other and exchange information. SWIFT does this a little differently: each node works on different tasks from the others, as and when those tasks need to be completed. SWIFT also makes the nodes communicate with each other all the time, not only at fixed points, allowing for much more flexibility. This cuts down on the time a node spends sitting and waiting for work, which is just wasted time, electricity, and ultimately money!
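
To give a flavour of the idea, here is a minimal sketch in Python (SWIFT itself is written in C and uses a far more sophisticated scheduler): hypothetical ‘density’ and ‘force’ tasks are chained per cell, so each force task starts as soon as its own density is ready, rather than waiting for a global synchronisation point.

    from concurrent.futures import ThreadPoolExecutor, as_completed

    # Stand-ins for real work: each cell's force task depends only on that
    # cell's density task, not on every other cell having finished.
    def density_task(cell):
        return sum(p * p for p in cell)

    def force_task(cell, rho):
        return [p * rho for p in cell]

    cells = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]

    with ThreadPoolExecutor(max_workers=4) as pool:
        density_futures = {pool.submit(density_task, c): c for c in cells}
        force_futures = []
        # As each density result arrives, its force task becomes runnable
        # immediately; there is no global barrier between the two phases.
        for done in as_completed(density_futures):
            cell = density_futures[done]
            force_futures.append(pool.submit(force_task, cell, done.result()))
        print([f.result() for f in force_futures])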

Another computing technology that has emerged over the last decades is so-called vector instructions. These allow a given computing core to process not just one number at a time (as in the past) but up to 8 (or even more on some machines!) in parallel. This means that a given compute core can solve the equations for 8 stars (for instance) at a time rather than just one. However, exploiting this capability is hard and requires writing very detailed code. This is rarely done in other codes, but our extra effort pays off: SWIFT can solve the same equations as other software in significantly less time!
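
NumPy arrays give a rough feel for this style of data parallelism (the real thing in SWIFT is hand-written C intrinsics, but the principle, one operation applied to many values at once, is the same):

    import numpy as np

    m = np.random.rand(8)               # masses of 8 particles
    v = np.random.rand(8)               # their speeds

    # Scalar style: one particle at a time, eight separate multiplications.
    e_scalar = [0.5 * mi * vi * vi for mi, vi in zip(m, v)]

    # Vector style: whole-array operations, which map naturally onto
    # hardware that processes 8 numbers per instruction.
    e_vector = 0.5 * m * v * v

    assert np.allclose(e_scalar, e_vector)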

What is SPH?

Smoothed Particle Hydrodynamics (SPH) is a numerical method for approximating the forces between fluid elements (gases or liquids). Let’s say that we want to simulate some water and a wave within it. Even a single litre of water has around 100000000000000000000000000 particles in it. To store that much data we would require a computer that has 100 trillion times as much storage space as all of the data on the internet. It’s clear that we need a more efficient way of simulating this water if we are to have any hope!

It turns out that we can represent the water by many fewer particles if we can smooth over the gaps between them efficiently. Smoothed Particle Hydrodynamics is the technique that we use to do that.
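
In practice this means estimating the density at each particle by summing over its neighbours, weighted by a smoothing kernel. The following sketch (using the classic Monaghan & Lattanzio cubic-spline kernel; SWIFT supports several kernels) shows the idea for a unit box of total mass one, where every density estimate should come out close to one away from the box edges:

    import numpy as np

    def W(r, h):
        # Monaghan & Lattanzio (1985) cubic-spline kernel in 3D,
        # with compact support of radius 2h.
        q = r / h
        w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
            np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
        return w / (np.pi * h**3)

    def sph_density(pos, mass, h):
        # rho_i = sum_j m_j W(|r_i - r_j|, h): the gaps between the
        # particles are "smoothed over" by the kernel.
        rho = np.empty(len(pos))
        for i in range(len(pos)):
            r = np.linalg.norm(pos - pos[i], axis=1)
            rho[i] = np.sum(mass * W(r, h))
        return rho

    pos = np.random.rand(1000, 3)          # 1000 particles in a unit box
    mass = np.full(1000, 1.0 / 1000)       # total mass of 1
    print(sph_density(pos, mass, h=0.1))   # estimates close to 1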

SPH was originally developed to solve problems in astrophysics, but it is now also a very popular tool in industry, with applications that affect our everyday life. For instance, turbines are modelled with this technique to understand how to harvest as much energy as possible from the wind. The method is also used to understand how waves and tsunamis affect shorelines, allowing scientists to design effective defences for the population.

Astronomer

Want to get started using SWIFT? Check out the on-boarding guide available here. SWIFT can be used as a replacement for different codes: initial conditions in HDF5 format from commonly used generators can be read directly by SWIFT. All you then need is a parameter file adapted for SWIFT!
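
For example, a Gadget-style HDF5 initial-conditions file (here a hypothetical ics.hdf5; dataset names follow the usual Gadget-style convention) can be inspected with a few lines of Python before being handed to SWIFT:

    import h5py

    with h5py.File("ics.hdf5", "r") as f:
        header = dict(f["Header"].attrs)      # particle counts, box size, ...
        print(header.get("NumPart_ThisFile"))
        gas = f["PartType0"]                  # gas particles live in PartType0
        print(gas["Coordinates"].shape)       # (N, 3) positions
        print(list(gas.keys()))               # Velocities, Masses, ...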

SWIFT combines multiple numerical methods, which are briefly outlined here. The art lies in coupling them efficiently to exploit modern computer architectures.

Gravity

SWIFT uses the Fast Multipole Method (FMM) to calculate gravitational forces between nearby particles. These forces can be combined with long-range forces provided by a mesh that captures the periodic nature of the calculation. SWIFT currently uses a single softening length, shared by all the particles, which can vary with time.

As well as this self-gravity mode, we also make many useful external potentials available, such as galaxy haloes or stratified boxes that are used in idealised problems.

Besides softening, gravitational accuracy can be tuned through the adaptive opening angle and the choice of multipole order for the short-range gravity calculation. The mesh forces are controlled by the mesh cell size and the frequency of its update.
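
The opening angle embodies the classic accuracy/speed trade-off of tree methods. As a rough illustration (SWIFT's adaptive criterion is more sophisticated, estimating the force error directly), the fixed-angle version of the test looks like this:

    def accept_multipole(cell_size, distance, theta=0.5):
        # Barnes-Hut-style test: treat a distant cell as a single multipole
        # if it subtends an angle smaller than theta; otherwise open it and
        # recurse into its children.
        return cell_size / distance < theta

    print(accept_multipole(1.0, 10.0))  # True: one cheap multipole interaction
    print(accept_multipole(1.0, 1.5))   # False: open the cell and recurse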

Cosmology

SWIFT implements a standard ΛCDM background expansion and solves the equations in a comoving frame. We allow for dark-energy equations of state that evolve with the scale-factor. The structure of the code easily allows for modified-gravity solvers or self-interacting dark-matter schemes to be implemented; these will be part of future releases of the code.

Unlike other cosmological codes, SWIFT does not express quantities in units of the reduced Hubble parameter. This reduces the possible confusion created by this convention when using the data products, but it requires users to convert their initial conditions (using a specific mode of operation of SWIFT!) when taking them from a different code.
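
As a concrete illustration of what that conversion involves (with a hypothetical value of the reduced Hubble parameter h, and quantities stored in the common Mpc/h and 1e10 Msun/h conventions):

    h = 0.7                              # hypothetical reduced Hubble parameter

    x_in_mpc_per_h = 25.0                # a coordinate stored in Mpc/h
    x_in_mpc = x_in_mpc_per_h / h        # the same coordinate, h-free, in Mpc

    m_in_1e10_msun_per_h = 5.0           # a mass stored in 1e10 Msun/h
    m_in_msun = m_in_1e10_msun_per_h / h * 1e10   # h-free, in Msun

    print(x_in_mpc, m_in_msun)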

Hydrodynamics Schemes

There are many hydrodynamics schemes implemented in SWIFT, and SWIFT is designed such that it should be simple for users to add their own variations.

All the schemes can be combined with a time-step limiter inspired by the method of Durier & Dalla Vecchia 2012, which is necessary to ensure energy conservation in simulations that involve sudden injections of energy, such as feedback events.
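
The core idea of such a limiter can be sketched in a few lines: when a particle becomes active on a very short time-step (after an energy injection, say), its neighbours are not allowed to remain on time-steps more than a fixed factor longer, so they can react to the event. The factor of 4 below follows the spirit of the published scheme; SWIFT's actual implementation lives inside its time-stepping machinery.

    MAX_RATIO = 4.0   # neighbours may be at most this factor slower

    def limit_neighbour_steps(dt_active, neighbour_dts):
        # Wake up any neighbour whose time-step is too long compared with
        # the newly active particle's time-step.
        return [min(dt, MAX_RATIO * dt_active) for dt in neighbour_dts]

    print(limit_neighbour_steps(0.001, [0.002, 0.016, 0.128]))
    # -> [0.002, 0.004, 0.004]: the two slow neighbours are woken up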

The four main modes are as follows:

SPHENIX SPH

This is our default Smoothed Particle Hydrodynamics scheme, fully described by Borrow 2022. The core equations use a density-energy formulation of the equations of motion, combined with variable artificial viscosity and conduction terms. These are accompanied by limiters that apply the extra terms only where they are necessary. The scheme was designed with galaxy formation applications in mind.
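
In a density-energy formulation, the pairwise contribution to a particle's acceleration takes, in its simplest equal-smoothing-length form and omitting SPHENIX's grad-h correction factors, viscosity, and conduction terms, the shape sketched below:

    def pairwise_acceleration(m_j, P_i, rho_i, P_j, rho_j, grad_W_ij):
        # dv_i/dt contribution from neighbour j in a density-energy SPH
        # scheme: -m_j * (P_i/rho_i^2 + P_j/rho_j^2) * grad_W_ij.
        return -m_j * (P_i / rho_i**2 + P_j / rho_j**2) * grad_W_ij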

Minimal SPH

In this mode, SWIFT uses the simplest energy-conserving SPH scheme that can be written, with no viscosity switches or thermal diffusion terms. It follows exactly the description in the review of the topic by Price 2012 and is not optimised. This mode is intended for educational purposes and can serve as a basis for developers creating other hydrodynamics schemes.

GADGET-2 SPH

SWIFT contains a ‘backwards-compatible’ GADGET-2 SPH mode, which uses a standard Monaghan 1997 artificial viscosity scheme with a Balsara switch. Note that this scheme is implemented to match the public release of GADGET-2. This enables users to use SWIFT as a drop-in replacement for GADGET-2, and of course allows for comparison exercises!
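
The Balsara switch itself is compact enough to quote: it measures whether the local flow is dominated by compression (where artificial viscosity is needed) or by shear (where viscosity would cause unwanted damping). A sketch:

    def balsara_switch(div_v, curl_v, sound_speed, h, eps=1e-4):
        # Balsara (1995): close to 1 in compressive flows (shocks), close
        # to 0 in pure shear, where viscosity would spuriously damp motion.
        return abs(div_v) / (abs(div_v) + abs(curl_v) + eps * sound_speed / h)

    print(balsara_switch(div_v=10.0, curl_v=0.1, sound_speed=1.0, h=0.1))  # ~1
    print(balsara_switch(div_v=0.1, curl_v=10.0, sound_speed=1.0, h=0.1))  # ~0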

Subgrid models for galaxy formation

SWIFT implements two main models to study galaxy formation. These are available in the public repository and different components (star formation, cooling, feedback, etc.) can be mixed and matched for comparison purposes.

EAGLE model

The EAGLE model of galaxy formation is available in SWIFT. This combines the cooling of gas due to interaction with the UV and X-ray background radiation of Wiersma 2009, the star-formation method of Schaye 2008, the stellar evolution and gas enrichment model of Wiersma 2009, feedback from stars following Dalla Vecchia 2012, super-massive black-hole accretion following Rosas-Guevara 2015 and black-hole feedback following Booth 2009. All these modules have been ported from the Gadget-3 code to SWIFT and will hence behave slightly differently.

GEAR model

The GEAR model is available in SWIFT. This model uses the GRACKLE library for cooling and is one of the many models that are part of the AGORA comparison project.

The following movie shows a dwarf galaxy simulated with our model using the zoom-in technique. The gas is shown in blue, the stars in yellow, and the dark matter in orange. The dwarf galaxy corresponds to h159 in Revaz & Jablonka 2018.

Structure finder

SWIFT can be linked to the VELOCIraptor phase-space structure finder to return haloes and sub-haloes while the simulation is running. This on-the-fly processing allows for a much faster time-to-science than the classic method of post-processing simulations after they have run.

Documentation and tests

There is a large amount of background reading material available in the theory directory provided with SWIFT.

SWIFT also provides a large library of hydrodynamical test cases for you to use, the results of which are available on our developer Wiki here.

Computer Scientist

Parallelisation strategy

SWIFT uses a hybrid MPI + threads parallelisation scheme with a modified version of the publicly available lightweight tasking library QuickSched as its backbone. Communications between compute nodes are scheduled by the library itself and use asynchronous calls to MPI to maximise the overlap between communication and computation. The domain decomposition itself is performed by splitting the graph of all the compute tasks, using the METIS library, to minimise the number of required MPI communications. The core calculations in SWIFT use hand-written SIMD intrinsics to process multiple particles in parallel and achieve maximal performance.
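
The communication/computation overlap can be illustrated with mpi4py (a deliberately simplified sketch; SWIFT's scheduler issues such non-blocking calls from within its task graph): each of two MPI ranks posts a non-blocking exchange, does useful local work, and only then waits on the network.

    from mpi4py import MPI
    import numpy as np

    # Run with: mpirun -n 2 python overlap.py   (hypothetical file name)
    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    other = 1 - rank

    send_buf = np.full(1000, float(rank))
    recv_buf = np.empty(1000)

    # Post the exchange asynchronously...
    requests = [comm.Isend(send_buf, dest=other),
                comm.Irecv(recv_buf, source=other)]
    # ...keep computing on local data while the network works...
    local_work = np.sum(send_buf ** 2)
    # ...and only block once the remote data is actually needed.
    MPI.Request.Waitall(requests)
    print(rank, local_work, recv_buf[0])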

Strong and weak scaling

Cosmological simulations are typically very hard to scale to large numbers of cores, due to the fact that information is needed from each of the nodes to perform a given time-step. SWIFT uses smart domain decomposition, vectorisation, and asynchronous communication to provide a 7x speedup over the Gadget code when running a typical EAGLE-like cosmological simulation. It also presents near-perfect weak scaling even on the largest problems presented in the published astrophysics literature.

I/O performance

SWIFT uses the parallel-hdf5 library to read and write snapshots efficiently on distributed file systems. Through careful tuning of Lustre parameters, SWIFT can write snapshots at the maximal disk writing speed of a given system.
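
A minimal sketch of what a collective parallel write looks like with h5py (assuming an h5py build against an MPI-enabled HDF5, as is common on HPC systems; the file and dataset names here are illustrative): every rank writes its own slice of a single shared file.

    import h5py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    n_local = 1000                        # particles owned by this rank

    # Open one shared file through the MPI-IO driver; dataset creation is
    # collective, but each rank then fills only its own slice.
    with h5py.File("snapshot.hdf5", "w", driver="mpio", comm=comm) as f:
        dset = f.create_dataset("Coordinates", (size * n_local, 3), dtype="f8")
        dset[rank * n_local:(rank + 1) * n_local] = np.random.rand(n_local, 3)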