Friday, September 28, 2012

Meet Mira, the Supercomputer That Makes Universes


Cosmology is the most ambitious of sciences. Its goal, plainly stated, is to describe the origin, evolution, and structure of the entire universe, a universe that is as enormous as it is ancient. Surprisingly, figuring out what the universe used to look like is the easy part of cosmology. If you point a sensitive telescope at a dark corner of the sky, and run a long exposure, you can catch photons from the young universe, photons that first sprang out into intergalactic space more than ten billion years ago. Collect enough of these ancient glimmers and you get a snapshot of the primordial cosmos, a rough picture of the first galaxies that formed after the Big Bang. Thanks to sky-mapping projects like the Sloan Digital Sky Survey, we also know quite a bit about the structure of the current universe. We know that it has expanded into a vast web of galaxies, strung together in clumps and filaments, with gigantic voids in between.

The real challenge for cosmology is figuring out exactly what happened to those first nascent galaxies. Our telescopes don't let us watch them in time-lapse; we can't fast-forward our images of the young universe. Instead, cosmologists must craft mathematical narratives that explain why some of those galaxies flew apart from one another, while others merged and fell into the enormous clusters and filaments that we see around us today. Even when cosmologists manage to cobble together a plausible story, they find it difficult to check their work. If you can't see a galaxy at every stage of its evolution, how do you make sure your story about it matches up with reality? How do you follow a galaxy through nearly all of time? Thanks to the astonishing computational power of supercomputers, a solution to this problem is beginning to emerge: You build a new universe.

In October, the world's third fastest supercomputer, Mira, is scheduled to run the largest, most complex universe simulation ever attempted. The simulation will cram more than 12 billion years' worth of cosmic evolution into just two weeks, tracking trillions of particles as they slowly coalesce into the web-like structure that defines our universe on a large scale. Cosmic simulations have been around for decades, but the technology needed to run a trillion-particle simulation only recently became available. Thanks to Moore's Law, that technology is getting better every year. If Moore's Law holds, the supercomputers of the late 2010s will be a thousand times more powerful than Mira and her peers. That means computational cosmologists will be able to run more simulations at faster speeds and higher resolutions. The virtual universes they create will become the testing ground for our most sophisticated ideas about the cosmos.

Salman Habib is a senior physicist at the Argonne National Laboratory and the leader of the research team working with Mira to create simulations of the universe. Last week, I talked to Habib about cosmology, supercomputing, and what Mira might tell us about the enormous cosmic web we find ourselves in.

Help me get a handle on how your project is going to work. As I understand it, you're going to create a computer simulation of the early universe just after the Big Bang, and in this simulation you will have trillions of virtual particles interacting with each other -- and with the laws of physics -- over a time period of more than 13 billion years. And once the simulation has run its course, you'll be looking to see if what comes out at the end resembles what we see with our telescopes. Is that right?

Habib: That's a good approximation of it. Our primary interest is large-scale structure formation throughout the universe, and so we try to begin our simulations well after the Big Bang, and even well after the microwave background era. Let me explain why. We're not sure how to simulate the very beginning of the universe because the physics is very complicated and partially unknown, and even if we could, the early universe is structurally homogeneous relative to the complexity that we see now, so you don't need a supercomputer to simulate it. Later on, at the time of the microwave background radiation, we have a much better idea about what's going on. WMAP and Planck have given us a really clear picture of what the universe looked like at that time, but even then the universe is still very homogeneous -- its density perturbations are something like one part in a hundred thousand. With that kind of homogeneity, you can still do the calculations and modeling without a supercomputer. But if you fast-forward to the point where the universe is about a million times denser than it is now, that's when things get so complicated that you want to hand over the calculations to a supercomputer.
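
To make Habib's point about the homogeneous era concrete, here is a minimal sketch, not part of the Argonne pipeline, of the kind of calculation that works fine without a supercomputer: the standard linear growth factor D(a) for a flat Lambda-CDM universe, which describes how small density perturbations (the one-part-in-a-hundred-thousand ripples he mentions) amplify as the universe expands. The cosmological parameters and numerical settings below are assumed values, not ones quoted in the interview.

```python
import numpy as np

OMEGA_M = 0.31           # matter density today (assumed value)
OMEGA_L = 1.0 - OMEGA_M  # dark-energy density, assuming a flat universe

def hubble(a):
    """Dimensionless Hubble rate H(a)/H0 for a flat Lambda-CDM universe."""
    return np.sqrt(OMEGA_M / a**3 + OMEGA_L)

def growth_factor(a, n=20000):
    """Linear growth factor D(a), normalized so that D(a=1) = 1."""
    def unnormalized(a_end):
        ap = np.linspace(1e-6, a_end, n)
        integrand = 1.0 / (ap * hubble(ap))**3
        # trapezoid rule written out explicitly to keep dependencies minimal
        integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(ap))
        return hubble(a_end) * integral
    return unnormalized(a) / unnormalized(1.0)

# Perturbations of ~1e-5 at the microwave-background era (a ~ 1/1100) grow by
# roughly a factor of a thousand to the present day in linear theory, which is
# why that epoch can be handled with integrals like this instead of a supercomputer.
for a in (1.0 / 1100, 0.01, 0.1, 1.0):
    print(f"a = {a:.5f}   D(a) = {growth_factor(a):.5f}")
```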

Now the trillions of particles we're talking about aren't supposed to be actual physical particles like protons or neutrons or whatever. Because these trillions of particles are meant to represent the entire universe, they are extremely massive, something in the range of a billion suns. We know the gravitational mechanics of how these particles interact, and so we evolve them forward to see what kind of densities and structure they produce, as a result of both gravity and the expansion of the universe. So, that's essentially what the simulation does: it takes an initial condition and moves it forward to the present to see if our ideas about structure formation in the universe are correct.
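
Habib's description boils down to a classic N-body calculation: set up an initial particle distribution, compute the gravitational pull of every particle on every other, and step the whole system forward in time. The sketch below illustrates that idea at toy scale; it is not the team's actual code, and the particle count, masses, box size, time step, and softening length are illustrative assumptions. Production cosmological codes also work in comoving coordinates to fold in the expansion of the universe, and they replace the brute-force pairwise sum with tree or particle-mesh methods so they can reach trillions of particles.

```python
import numpy as np

G = 6.674e-11        # gravitational constant, SI units
SOFTENING = 1.0e19   # force-softening length in meters (assumed), keeps the
                     # very heavy tracer particles from scattering artificially

def accelerations(pos, mass):
    """Softened pairwise Newtonian accelerations for every particle."""
    # pos: (N, 3) positions in meters; mass: (N,) masses in kilograms
    diff = pos[None, :, :] - pos[:, None, :]          # r_j - r_i for each pair, shape (N, N, 3)
    dist2 = np.sum(diff**2, axis=-1) + SOFTENING**2   # softened squared separations, shape (N, N)
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                     # a particle exerts no force on itself
    return G * np.sum(mass[None, :, None] * diff * inv_d3[:, :, None], axis=1)

def evolve(pos, vel, mass, dt, n_steps):
    """Kick-drift-kick leapfrog: push the initial condition forward in time."""
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc            # half kick
        pos += dt * vel                  # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc            # half kick
    return pos, vel

# Toy run: a few hundred tracer particles, each about a billion solar masses,
# scattered through a roughly 100-kiloparsec box and evolved under their mutual gravity.
rng = np.random.default_rng(0)
n = 256
mass = np.full(n, 1e9 * 1.989e30)               # ~1e9 solar masses per particle
pos = rng.uniform(0.0, 3.0e21, size=(n, 3))     # box side ~3e21 m, about 100 kpc
vel = np.zeros((n, 3))
pos, vel = evolve(pos, vel, mass, dt=1.0e13, n_steps=100)
print("spread of final particle positions (m):", pos.std(axis=0))
```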

by Ross Andersen, The Atlantic | Read more:
Photo: Argonne National Laboratory