Exploring the dark universe with supercomputers: article in symmetry magazine

The other day, I happened across the online magazine symmetry (yes, it's lower case!), which publishes articles on particle physics and related topics, such as dark matter, dark energy, and neutrinos, all written for the educated layperson. I was impressed by the consistently engaging quality of the writing, and recommend you take a look if, like me, you are interested in these areas but do not work in them. The magazine has a strong presence on the usual social media platforms, and you can sign up for an e-mail summary too.

I was drawn to an article by Katie Elyce Jones, "Exploring the dark universe with supercomputers," in which she describes how supercomputers will work in tandem with the next generation of sky surveys, LSST and the Dark Energy Survey (DES), to explore the nature of dark energy. The central issue is whether dark energy acts as a repulsive force counteracting gravity, or whether other phenomena are at work of which we currently have no knowledge. Because the nature of dark energy is unknown, simulations are essential to the analysis: they allow us to understand how a particular physical model would manifest in the data. That is, they are the key predictive tool in next-generation cosmological studies.

Now, it turns out that the effects of dark energy are seen only on scales between galaxies, so probing its nature requires massive simulations of the growth of structure in the Universe. A team at Argonne National Laboratory has therefore used the Hardware/Hybrid Accelerated Cosmology Code (HACC) to model the time evolution of trillions of interacting particles. HACC is the first cosmology code designed to run on hybrid CPU/GPU supercomputers, as well as on multicore and many-core architectures. The HACC team recently completed a petascale model of the Universe spanning 13 billion years, and will release it to researchers.
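To give a flavor of the kind of computation involved, here is a minimal, toy N-body step in Python. It uses direct O(N²) gravitational summation with a leapfrog integrator, which is *not* how HACC works at scale (production codes use particle-mesh and tree methods to make trillions of particles tractable); the function name, softening value, and two-body setup below are illustrative choices of mine, not anything from the article.

```python
import numpy as np

def gravity_step(pos, vel, mass, dt, G=1.0, softening=0.05):
    """Advance an N-body system one leapfrog (kick-drift-kick) step.

    Direct O(N^2) summation -- a toy stand-in for the particle-mesh
    and tree solvers used by production codes like HACC.
    """
    def accel(p):
        # Pairwise separation vectors: diff[i, j] = p[j] - p[i]
        diff = p[None, :, :] - p[:, None, :]
        # Softened squared distances avoid a singularity at zero separation
        dist2 = (diff ** 2).sum(axis=-1) + softening ** 2
        inv_d3 = dist2 ** -1.5
        np.fill_diagonal(inv_d3, 0.0)  # no self-force
        # a_i = G * sum_j m_j * (p_j - p_i) / |p_j - p_i|^3
        return G * (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

    vel = vel + 0.5 * dt * accel(pos)  # half kick
    pos = pos + dt * vel               # drift
    vel = vel + 0.5 * dt * accel(pos)  # half kick
    return pos, vel

# Two equal masses orbiting their common center of mass
pos = np.array([[-0.5, 0.0, 0.0], [0.5, 0.0, 0.0]])
vel = np.array([[0.0, -0.5, 0.0], [0.0, 0.5, 0.0]])
mass = np.array([1.0, 1.0])
for _ in range(1000):
    pos, vel = gravity_step(pos, vel, mass, dt=0.01)
```

Even this toy version shows why the problem demands supercomputers: the pairwise-force loop grows quadratically with particle count, which is precisely what hybrid CPU/GPU architectures and cleverer algorithms are brought in to beat.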

Models such as these will be combined with models of the telescopes and the atmosphere to understand how the observed sky will look. By varying the parameters of these simulated Universes, astronomers will be able to understand the effects of random and systematic errors in the data. A consequence of this approach is that the simulations will produce 10-100 times more data than the surveys themselves. It's not just the observational data that require new approaches to managing large dynamic data sets!

A HACC simulation shows how matter is distributed in the universe over time. (Katrin Heitmann et al., Argonne National Laboratory)

