Generations of Supercomputers Pin Down Primordial Plasma
As one groundbreaking IBM system retires, a new Blue Gene supercomputer comes online at Brookhaven Lab to help precisely model subatomic interactions
March 31, 2014
Brookhaven Lab physicists Peter Petreczky and Chulwoo Jung with technology architect Joseph DePace—who oversees operations and maintenance of the Lab's supercomputers—in front of the Blue Gene/Q supercomputer.
Supercomputers are constantly evolving to meet the increasing complexity of calculations ranging from global climate models to cosmic inflation. The bigger the puzzle, the more scientists and engineers push the limits of technology forward. Imagine, then, the advances driven by scientists seeking the code behind our cosmos.
This mutual push and pull of basic science and technology plays out every day among physicists at the U.S. Department of Energy's Brookhaven National Laboratory. The Lab’s Lattice Gauge Theory Group—led by physicist Frithjof Karsch—hunts for equations to describe the early universe and the forces binding matter together. Their search spans generations of supercomputers and parallels studies of the primordial plasma discovered and explored at Brookhaven's Relativistic Heavy Ion Collider (RHIC).
"This is the beauty of pinning down fundamental interactions: the foundations of matter are literally universal. And only a few groups in the world are describing this particular aspect of our universe.”
— Brookhaven Lab physicist Peter Petreczky
"You need more than just pen and paper to recreate the quantum-scale chemistry unfolding at the foundations of matter—you need supercomputers," said Brookhaven Lab physicist Peter Petreczky. "The racks of IBM’s Blue Gene/L hosted here just retired after six groundbreaking years, but the cutting-edge Blue Gene/Q is now online to keep pushing nuclear physics forward. "
Equations to Describe the Dawn of Time
When RHIC smashes gold ions together at nearly the speed of light, the trillion-degree collisions melt the protons and neutrons that make up each ion. The quarks and gluons inside then break free for a fraction of a second, mirroring the ultra-hot conditions of the universe just microseconds after the Big Bang. This remarkable matter, called quark-gluon plasma, surprised physicists by exhibiting near-zero viscosity—it behaved like a nearly perfect, friction-free liquid. But this raised new questions: how and why?
Armed with the right equations of state, scientists can begin to answer those questions and model that perfect plasma at each instant. This very real quest revolves in part around the very artificial: computer simulations.
“If our equations are accurate, the laws of physics hold up through the simulations and we gain a new and nuanced vocabulary to characterize and predict truly fundamental interactions,” Karsch said. “If we're wrong, the simulation produces something very different from reality. We’re in the business of systematically eliminating uncertainties.”
Building a Quantum Grid
Quantum chromodynamics (QCD) is the theoretical framework that describes these particle interactions on the subatomic scale. But even the most sophisticated computer can't replicate the full complexity of QCD as it plays out in reality.
"To split that sea of information into discrete pieces, physicists developed a four-dimensional grid of space-time points called the lattice," Petreczky said. "We increase the density of this lattice as technology evolves, because the closer we pack our lattice-bound particles, the closer we approximate reality."
Imagine a laser grid projected into a smoke-filled room, transforming that swirling air into individual squares. Each intersection in that grid represents a data point that can be used to simulate the flow of the actual smoke. In fact, scientists use this same lattice-based approximation in fields as diverse as climate science and nuclear fusion.
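To make the QCD version of that grid concrete, here is a minimal sketch in Python of how a four-dimensional lattice might be laid out in memory, carrying the 3x3 complex "link" matrices that lattice QCD attaches to the bonds between neighboring sites. This is a toy illustration under our own assumptions, with illustrative sizes and variable names, not the group's production code.

import numpy as np

# Toy 4D space-time lattice. Lattice QCD stores an SU(3) "link" matrix
# (3x3, complex-valued) on each bond connecting a site to its neighbor
# in each of the four space-time directions.
N_S = 16  # points per spatial direction (the first QCDOC runs used 16)
N_T = 16  # points in the time direction (an illustrative choice)

# Shape: time x space x space x space x direction x 3x3 matrix entries.
links = np.zeros((N_T, N_S, N_S, N_S, 4, 3, 3), dtype=complex)

sites = N_T * N_S**3
print(f"lattice sites:   {sites:,}")       # 65,536 sites at 16^4
print(f"link matrices:   {sites * 4:,}")   # one per site, per direction
print(f"complex entries: {links.size:,}")  # 9 per link matrix

Even this toy lattice already tracks millions of numbers, and a production simulation must update such a lattice over and over while averaging across quantum fluctuations.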
As QCD scientists incorporated more and more subatomic details into an ever-denser grid—including the full range of quark and gluon types—the computational demands grew explosively.
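A rough back-of-the-envelope calculation shows why (illustrative counting only, not the collaboration's actual cost model). With \(N\) points in each of the four space-time directions, the number of lattice sites grows as the fourth power:

\[ N_{\text{sites}} = N^4, \qquad \frac{32^4}{16^4} = 2^4 = 16. \]

And the full cost of a simulation climbs far faster than the raw site count, because finer lattices also demand more solver iterations and smaller algorithmic step sizes, especially as the simulated quark masses are pushed down toward their physical values.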
QCD on a Chip
Physicist Norman Christ, a Columbia University professor and frequent Brookhaven Lab collaborator, partnered with supercomputing powerhouse IBM to tackle the unprecedented hardware challenge for QCD simulations. The new system would need a relatively small physical footprint, good temperature control, and a combination of low power and high processor density.
The result was the groundbreaking QCDOC, or QuantumChromoDynamics On a Chip. QCDOC came online in 2004 with a processing power of 10 teraflops, or 10 trillion floating-point operations per second, a standard measure of computing performance.
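For scale, the prefix is simple unit arithmetic: \(10\ \text{teraflops} = 10 \times 10^{12} = 10^{13}\) floating-point operations per second, on the order of a thousand operations each second for every person on Earth at the time.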
"The specific needs of Christ and his collaborators actually revolutionized and rejuvenated supercomputing in this country," said physicist Berndt Mueller, who leads Brookhaven Lab’s Nuclear and Particle Physics directorate. "The new architecture developed for QCD simulations was driven by these fundamental physics questions. That group laid the foundation for generations of IBM supercomputers that routinely rank among the world's most powerful."
Generations of Giants
The first QCDOC simulations featured lattices with 16 points in each spatial direction—a strong starting point and testing ground for QCD hypotheses, but a far cry from definitive. Building on QCDOC, IBM launched its Blue Gene series of supercomputers. In fact, the chief architect for all three generations of these highly scalable, general-purpose machines was physicist Alan Gara, who worked with Christ on Columbia's QCD-focused supercomputers before being recruited by IBM.
"We had the equation of state for quark-gluon plasma prepared for publication in 2007 based on QCDOC calculations," Petreczky said, "but it was not as accurate as we hoped. Additional work on the newly installed Blue Gene/L gave us confidence that we were on the right track."
The New York Blue system—led by Stony Brook University and Brookhaven Lab with funding from New York State—added 18 racks of Blue Gene/L and two racks of Blue Gene/P in 2007. This 100-teraflop boost doubled the lattice density of the QCD models to 32 points in each spatial direction and enabled simulations some 10 million times more complex. Throughout this period, lattice theorists also used Blue Gene supercomputers at DOE’s Argonne and Lawrence Livermore national labs.
The 600-teraflop Blue Gene/Q came online at Brookhaven Lab in 2013, packing the processing power of 18 racks of Blue Gene/P into just three racks. This new system signaled the end for Blue Gene/L, which went offline in January 2014. Both QCDOC and Blue Gene/Q were developed in close partnership with RIKEN, a leading Japanese research institution.
"Exciting as it is, moving across multiple systems is also a bit of a headache," Petreczky said. "Before we get to the scientific simulations, there's a long transition period and a tremendous amount of code writing. Chulwoo Jung, one of our group members, takes on a lot of that crucial coding.”
Pinning Down Fundamental Fabric
Current simulations of QCD matter feature 64 lattice points in each spatial direction, giving physicists an unprecedented opportunity to map the quark-gluon plasma created at RHIC and explore the strong nuclear force. The Lattice Gauge Theory collaboration continues to run simulations and plans to extend the equations of state to cover all the energy levels achieved at both RHIC and the Large Hadron Collider at CERN.
The equations already ironed out by Brookhaven’s theorists apply to everything from RHIC's friction-free superfluid to physics beyond the standard model—including the anomalous magnetic moment of the muon probed by the g-2 experiment and rare meson decays studied at Fermilab.
"This is the beauty of pinning down fundamental interactions: the foundations of matter are literally universal," Petreczky said. "And only a few groups in the world are describing this particular aspect of our universe.”
Additional Brookhaven Lab lattice theorists include Michael Creutz, Christoph Lehner, Taku Izubuchi, Swagato Mukherjee, and Amarjit Soni.
The Brookhaven Computational Science Center (CSC) hosts the IBM Blue Gene supercomputers and Intel clusters used by scientists across the Lab. The CSC brings together researchers in biology, chemistry, physics and medicine with applied mathematicians and computer scientists to take advantage of the new opportunities for scientific discovery made possible by modern computers. The CSC is supported by DOE’s Office of Science.
DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
2014-4616