Nuclear Physics Data Demand More Powerful Processing

Jefferson Lab and Brookhaven National Lab partner on a Software & Computing Round Table to track the leading edge of computing and foster collaboration

By Tamara Dietrich


The following story was originally published by the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility, a key partner in Brookhaven National Laboratory’s effort to build a future Electron-Ion Collider.

Fans of the popular TV show “The Big Bang Theory” can picture the sitcom’s physicists standing at a whiteboard, staring hard at equations.

It’s an iconic image. But is that the future — or even the present — of how nuclear physicists do their jobs? Not really. Not when new experiments demand ever more powerful data processing and, in turn, ever more powerful software and computing.

“Scientists being at a blackboard and writing up some equations — that is not always the reality,” said Markus Diefenthaler, an experimental nuclear physicist at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility in Newport News, Virginia.

“We also write analysis programs and simulations to make sense of the vast amount of data that we have collected to try to learn about particle structure and dynamics,” Diefenthaler said. “The software and computing can really be an integral and a fundamental part of the science.”

So integral and so fundamental that in 2016, Diefenthaler helped organize the Software & Computing Round Table, a monthly forum for presentations and discussions among colleagues and Ph.D. students. Its stated goal: to explore the expanding role of software and computing in high energy and nuclear physics and related fields and to foster common projects within the scientific community.

The group has grown so successful that last year, organizers teamed up with the U.S. Department of Energy’s Brookhaven National Laboratory on Long Island, New York, to broaden their perspective, their offerings and their target audience.

The collaboration is particularly timely, as the DOE has chosen Brookhaven as the site for its proposed Electron-Ion Collider (EIC), a one-of-a-kind, next-generation facility considered critical to the future of physics research and particle accelerator technology in this country and around the world. Jefferson Lab is a major partner in realizing the EIC, providing key support for the new collider.

Torre Wenaus, senior physicist at Brookhaven and leader of its Nuclear and Particle Physics Software group, said the EIC offers a blank canvas and unique opportunity to benefit from science community members with long experience working on multiple generations of software frameworks.


Their expertise is invaluable for devising frameworks for so-called greenfield experiments — those in emerging areas that are still wide open for innovation, Wenaus said.

“The EIC is an opportunity to really take an expansive view in deciding how best to do things in a long-range project, without being bound by a lot of existing history and computing infrastructure, while still leveraging the experience that people bring to it from prior activities,” Wenaus said.

While the EIC project offers tantalizing possibilities for the round table, just as valuable is what the forum offers right now, as Jefferson Lab and Brookhaven engage in world-class nuclear physics research that delivers ever-greater amounts of data and demands ever-greater processing power.

“Higher luminosity means more data,” said Wenaus, “which means a bigger job processing the data at various stages, from initial decisions as to what data you save, to simulating the physics in our detector.”

For instance, Jefferson Lab’s upgraded 12 GeV Continuous Electron Beam Accelerator Facility (CEBAF), a DOE Office of Science User Facility, has the highest luminosity in the world, enabling experiments that, like a mighty microscope, probe deep into protons and neutrons to study quarks and gluons, the building blocks of the universe. In one key experiment, called GlueX, researchers hope CEBAF’s enhanced luminosity can produce new particles called hybrid mesons and answer the fundamental question of why no quark has ever been found alone. That high luminosity translates into extreme amounts of data, with GlueX alone generating 1 gigabyte (GB) per second.
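To put that rate in perspective, here is a minimal back-of-the-envelope sketch, in Python, of the data volume it implies. The 1 GB-per-second figure comes from the article; the assumed hours of running per day are purely illustrative.

# Rough data-volume estimate implied by GlueX's quoted 1 GB/s rate.
# The rate is from the article; the daily running time is an assumption.
RATE_GB_PER_S = 1.0          # quoted GlueX data rate
ASSUMED_HOURS_PER_DAY = 12   # hypothetical hours of beam time per day

seconds_per_day = ASSUMED_HOURS_PER_DAY * 3600
daily_tb = RATE_GB_PER_S * seconds_per_day / 1000   # decimal terabytes per day
weekly_pb = daily_tb * 7 / 1000                     # decimal petabytes per week

print(f"~{daily_tb:.0f} TB per day, ~{weekly_pb:.2f} PB per week")
# Under these assumptions: ~43 TB per day, ~0.30 PB per week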

And Brookhaven’s Relativistic Heavy Ion Collider (RHIC), also a DOE Office of Science User Facility, is the first collider in the world capable of smashing together heavy ions. Nuclear physicists use RHIC and its specialized detectors to study a state of matter called quark-gluon plasma. Continual upgrades at RHIC over its 20 years of operations have resulted in a 44-fold increase in luminosity, far beyond what was imagined when the facility was initially designed.

“These high-luminosity facilities with very complex detectors and data rates demand a lot of computing and really force us to track the leading edge of software and computing,” Wenaus said.

Sometimes, though, it’s physics that takes the lead. One example of computing and software advances flowing from physics, he said, has been the revolution in machine learning and artificial intelligence over the last eight years. A key paper describing such deep learning approaches was published in 2012, just months after the discovery of the elusive Higgs boson capped a decades-long search. Physicists, after all, had long explored such data analysis methods in efforts to better understand their data.

“It’s always interesting to me that everything we’ve done since the Higgs has tracked exactly the same time scale as the really exponential revolution in machine learning that we’ve done over that time,” Wenaus said.


And for every high-energy physicist with a long-enough memory, he said, another favorite example of physics software and computing technology flowing to the wider world is the World Wide Web, developed at CERN in 1989 to enable scientists, universities and research facilities to share information.

There’s also been tremendous growth over the last 10 years in what’s known as common software in the science community. Such software has been around for decades, but it has become increasingly available through open-source options.

Common software will be a key topic of a virtual workshop that round table organizers are offering Sept. 29-Oct. 1. The Future Trends in Nuclear Physics Computing workshop is in lieu of a monthly round table and intended to chart a path for software and computing in nuclear physics for the next 10 years.

Organizers say the Software & Computing Round Table has helped inspire collaboration among physicists and move projects forward.

“Some of them, such as greenfield frameworks, are still taking shape, and the impact lies primarily in the future,” Wenaus said.

Computing is integral to modern science, Diefenthaler said, but its value extends beyond mere numbers.

“A quote which I really like is from one of the pioneers of computing, Richard Hamming,” said Diefenthaler. “He said, ‘The purpose of computing is insight, not numbers.’ And this is really what software and computing is giving us — it’s giving us insight into the scientific questions which we are trying to answer.”

German-born Diefenthaler joined Jefferson Lab in 2015 and is part of its EIC Center. He investigates the inner structure of the nucleon, in particular so-called transverse momentum dependent (TMD) observables, which are being explored to help map out the spin and momentum of the quarks and gluons inside protons and neutrons. Diefenthaler is part of the research collaboration that made the first observation of the Sivers effect, which relates TMD observables to the proton’s spin and the behavior of its quarks and gluons, in measurements of the semi-inclusive deep-inelastic scattering process, and that provided seminal results for many other TMD observables.

Wenaus has worked on nuclear physics at Brookhaven since 1997, with many stints at CERN working on the Large Hadron Collider (LHC). He is a member of the ATLAS collaboration, one of the four major experiments at the LHC, which, along with the CMS collaboration, first observed the Higgs boson in 2012. Wenaus also co-leads U.S. ATLAS efforts to help develop software for the High Luminosity Large Hadron Collider (HL-LHC) at CERN. The HL-LHC is an upgrade that will significantly boost the LHC’s luminosity. It is expected to start taking data around 2027 at a rate that’s an order of magnitude greater than currently possible.

The round table was inspired by a popular workshop series called Future Trends in Nuclear Physics Computing. The first workshop was organized by Diefenthaler, along with Amber Boehnlein, now head of Jefferson Lab’s new Computational Sciences & Technology Division, and Graham Heyes, head of the lab’s Scientific Computing Department. The second workshop was organized by a group of 10 from six different institutions, and the third, mentioned above, is being organized by the Software & Computing Round Table team.

Topics are chosen by an eight-member committee composed of physicists from Jefferson Lab and Brookhaven. Speakers come from fellow DOE national labs, international accelerator facilities and research universities.

“We really focus on programming, on how to process data, how to handle data, how to do the analysis,” Diefenthaler said.

Both physicists stress that anyone interested in learning more, or in contributing to the conversation about the interplay of nuclear and high energy physics, software and computing, is invited to attend Software & Computing Round Table presentations and discussions.

Contact: Kandice Carter, Jefferson Lab Communications Office, kcarter@jlab.org
