## Supercomputers Help Solve a 50-Year Homework Assignment

### Calculation related to question of why the universe is made of matter

September 26, 2013

Members of Brookhaven Lab's high-energy physics theory group who were involved in the kaon decay calculations. Sitting, left to right: Christoph Lehner, Amarjit Soni, Taku Izubuchi, Christopher Kelly, Chulwoo Jung. Standing, left to right: Eigo Shintani, Hyung-Jin Kim, Ethan Neil, Taichi Kawanai, Tomomi Ishikawa.

Kids everywhere grumble about homework. But their complaints will hold no water with a group of theoretical physicists who’ve spent almost 50 years solving one homework problem—a calculation of one type of subatomic particle decay aimed at helping to answer the question of why the early universe ended up with an excess of matter.

Without that excess, the matter and antimatter created in equal amounts in the Big Bang would have completely annihilated one another. Our universe would contain nothing but light—no homework, no schools…but also no people, or planets, or stars!

Physicists long ago figured out *something* must have happened to explain the imbalance—and our very existence.

"Our results will serve as a tough test for our current understanding of particle physics.”

— Brookhaven theoretical physicist Taku Izubuchi

“The fact that we have a universe made of matter strongly suggests that there is some violation of symmetry,” said Taku Izubuchi, a theoretical physicist at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory.

The physicists call it charge conjugation-parity (CP) violation. Instead of everything in the universe behaving perfectly symmetrically, certain subatomic interactions happen differently if viewed in a mirror (violating parity) or when particles are swapped with their oppositely charged antiparticles (violating charge conjugation symmetry). Scientists at Brookhaven—James Cronin and Val Fitch—were the first to find evidence of such a symmetry “switch-up” in experiments conducted in 1964 at the Alternating Gradient Synchrotron, with additional evidence coming from experiments at CERN, the European Organization for Nuclear Research. Cronin and Fitch received the 1980 Nobel Prize in physics for this work.

Theoretical physicists and kaon-decay calculators Norman Christ, Robert Mawhinney (both of Columbia University), and Taku Izubuchi (of Brookhaven), holding one rack of the QCDOC supercomputer at Brookhaven, which was used for many of the earlier kaon calculations. It was replaced by QCDCQ in 2012.

What was observed was the decay of a subatomic particle known as a kaon into two other particles called pions. Kaons and pions (and many other particles as well) are composed of quarks. Understanding kaon decay in terms of its quark composition has posed a difficult problem for theoretical physicists.

“That was the homework assignment handed to theoretical physicists, to develop a theory to explain this kaon decay process—a mathematical description we could use to calculate how frequently it happens and whether or how much it could account for the matter-antimatter imbalance in the universe. Our results will serve as a tough test for our current understanding of particle physics,” Izubuchi said.

### Sophisticated computational tools

The mathematical equations of Quantum Chromodynamics, or QCD—the theory that describes how quarks and gluons interact—have a multitude of variables and possible values for those variables. So the scientists had to wait for supercomputing capabilities to evolve before they could actually solve them. The physicists invented the complex algorithms and wrote the software packages that some of the world’s most powerful supercomputers ran to describe the quarks’ behavior and solve the problem.

In the physicists’ software, the particles are “placed” on an imaginary four-dimensional space-time lattice consisting of three spatial dimensions plus time. At one end of the time dimension lies the kaon, made of two kinds of quarks—a “strange” quark and an “anti-down” quark—held together by gluons. At the opposite end, they place the end products, the four quarks that make up the two pions. Then the supercomputer computes how the kaon transforms into two pions as it flies through space and time. Conducting these computations on the lattice greatly simplifies the problem.
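The lattice setup described above can be pictured with a toy sketch. This is an illustration only, not real lattice QCD code—actual simulations store complex-valued quark fields and gluon links at every site, and the lattice extents below are made-up numbers:

```python
# Toy sketch of the four-dimensional space-time lattice (illustration only).
T, L = 16, 8                 # hypothetical time extent and spatial extent per axis
kaon_source_t = 0            # kaon (strange + anti-down quark) placed at one time edge
pion_sink_t = T - 1          # the two pions (four quarks) placed at the opposite edge
n_sites = T * L ** 3         # space-time points the solver must sweep: 3 space dims + time
```

Even this tiny example lattice has 8,192 sites, each carrying many field values—which is why the full calculation demands supercomputers.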

“We use the supercomputers to look at how each quark is flying—its velocity, direction—in other words, the dynamics of the strong QCD interaction,” Izubuchi said.

Somewhere in the middle of this complicated space-time grid, with some degree of probability, the strange quark of the kaon—which the strong force keeps strongly bound with its anti-down quark partner—suddenly starts to change into a down quark by the so-called electroweak interaction. Since a kaon is heavier than two pions, the energy released creates a new quark/anti-quark pair—an “up” and an “anti-up” quark—from the vacuum. These quarks then combine with the new down quark and the leftover anti-down quark to make the two pions.

“The experiments showed how frequently these ‘K→ππ’ processes happen, but the part that violates CP symmetry is the strange quark converting into a down quark through the weak interaction,” Izubuchi said. “That’s the part we really wanted to know more about to understand the strength of this CP violation. That information will give us a hint of why the universe is matter-rich, and/or confirm the correctness of our current understanding of particle physics.”

The supercomputers crunched tens of billions of numbers through the equation that describes this part of the process, seeking the result that should reproduce the decay patterns and frequencies observed in the experiments.

“The result of the calculation tells us how frequently this CP-violating weak interaction occurs and the strength of the CP violation at the quark level,” Izubuchi said. “It’s a kind of reverse-engineering what experimenters have seen in kaon decays to solve the problem.”

### New algorithm, higher precision

After publishing their initial results in 2012, the physicists further improved their calculation to more closely simulate what happens with these particles in Nature. These new calculations allow them to directly compare their numbers with the experimental results more accurately, but they also increase the computational “cost” considerably—requiring more computing power/time. Even with the newest supercomputers, the homework would have taken many years if not for a new efficient algorithm developed by the Brookhaven group in late 2012.

“This new algorithm, called all-mode averaging (AMA), divides the whole calculation into a ‘difficult’ but small piece and an ‘easier’ large piece, and devotes more computation time to the latter part to save the total computation required,” Izubuchi said. “It accelerates the speed of the computations by a factor of ten or more. This very simple idea of dividing the calculation into two pieces actually helped to reduce the statistical error of the computation by a lot.”
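The two-piece averaging idea behind AMA can be illustrated with a toy estimator. This is a sketch of the general strategy only, not the collaboration's actual code; the function names and the cheap/exact stand-ins are hypothetical (in the real lattice method, the "approximation" is typically a relaxed, lower-precision solve and the "exact" piece a full-precision solve):

```python
def ama_estimate(samples, exact, approx, n_exact):
    """Toy all-mode-averaging-style estimator (illustrative sketch).

    Averages a cheap approximation over every sample, then corrects its
    bias with the expensive exact calculation on a small subset.
    """
    # "easier" large piece: cheap approximation averaged over all samples
    cheap = sum(approx(s) for s in samples) / len(samples)
    # "difficult" small piece: exact-minus-approximate correction on a subset
    subset = samples[:n_exact]
    correction = sum(exact(s) - approx(s) for s in subset) / len(subset)
    return cheap + correction
```

Because the correction term cancels the approximation's bias, most of the computing time can go into the inexpensive part of the calculation while the final average stays faithful to the exact one—which is how the statistical error shrinks for a fixed computational budget.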

### Do the numbers add up?

Is the calculated strength of the weak interaction strong enough to account for the matter-antimatter asymmetry in the early universe?

“That’s the million-dollar question,” said Izubuchi. “So far people think this is not the full answer. We cannot explain why the universe is matter-rich based solely on the amount of CP violation that this kaon decay accounts for. So there may be sources of CP violation other than the weak interaction that would be revealed if a discrepancy were found between our calculation and the experimental results.”

Then Izubuchi confessed that the theorists have only solved half of their homework problem.

“When we say we theoretically understood this process, it is only half true. There are two different ways the two end-result pions can combine with each other (called isospin states), and we’ve only solved the problem for one combination, the isospin 2 channel.”

The experiments have measurements for both isospin states, so the theorists are working on calculating the second process as well.

“The other, isospin 0, is more challenging, and we are getting there by employing the faster supercomputers and new theoretical ideas and computation algorithms. But, for now, we have finished half of 50 years’ homework.”

This research is part of DOE’s Scientific Discovery through Advanced Computing (SciDAC-3) program “Searching for Physics Beyond the Standard Model: Strongly-Coupled Field Theories at the Intensity and Energy Frontiers,” supported by the DOE Office of Science.

The supercomputing resources used for this research included: QCDCQ, a pre-commercial version of the IBM Blue Gene supercomputers, located at the RIKEN/BNL Research Center—a center funded by the Japanese RIKEN laboratory in a cooperative agreement with Brookhaven Lab; a Blue Gene/Q supercomputer of the New York State Center for Computational Science, hosted by Brookhaven; half a rack of an additional Blue Gene/Q funded by DOE through the US-based lattice QCD consortium, USQCD; a Blue Gene/Q machine at the Edinburgh Parallel Computing Centre; the large installation of Blue Gene/P (Intrepid) and Blue Gene/Q (Mira) machines at Argonne National Laboratory funded by the DOE Office of Science; and PC cluster machines at Fermi National Accelerator Laboratory and at RIKEN.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

2013-4280