The following news release announces new funding for the Open Science Grid (OSG), a distributed computing resource for the scientific community. The release was prepared by the OSG Consortium and is being issued jointly by the University of Wisconsin–Madison and the U.S. Department of Energy’s Brookhaven National Laboratory and Fermi National Accelerator Laboratory (Fermilab).
Karen McNulty Walsh, Brookhaven National Laboratory, 631-344-8350, email@example.com
Katie Yurkewicz, Fermilab Office of Communication, 630-840-3351, Katie@fnal.gov
Miron Livny, University of Wisconsin–Madison, 608-316-4336, firstname.lastname@example.org
June 20, 2012
UPTON, N.Y. — Every day researchers add another sea of data to an ocean of knowledge on the world around us — billions on top of billions of measurements, images and observations of the tiniest subatomic particles up to the movement of planets and stars.
“Making sense of that — simulating, mapping, analyzing — this is how researchers work these days,” said Miron Livny, computer sciences professor at the University of Wisconsin–Madison. “More and more researchers need more and more computing power to support that work.”
To that end, the Department of Energy Office of Science and the National Science Foundation have committed up to $27 million to Open Science Grid (OSG), a nine-member partnership extending the reach of distributed high-throughput computing (DHTC) capabilities.
Distributed computing musters the power of a network of machines that reside at different institutions to make the best use of all available processing and storage capacity, giving scientists the muscle of a supercomputer that may otherwise be out of reach.
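The core idea can be sketched in a few lines of Python. This is a toy illustration, not OSG's actual middleware (which is built on grid tools such as HTCondor): it farms independent tasks out to a local pool of workers, whereas a real distributed high-throughput system matches jobs to idle machines at many institutions. The `analyze` function is a hypothetical stand-in for a researcher's computation.

```python
# Minimal sketch of high-throughput computing: many independent tasks,
# each runnable anywhere, in any order, with results gathered at the end.
# A local thread pool stands in for the network of machines OSG uses.
from concurrent.futures import ThreadPoolExecutor

def analyze(event):
    """Hypothetical stand-in for one independent analysis task."""
    return event * event  # pretend this is an expensive computation

def run_campaign(events):
    # Because the tasks share no state, they can be scattered across
    # whatever workers (or, in a real grid, institutions) are free.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(analyze, events))

if __name__ == "__main__":
    print(run_campaign(range(10)))
```

The essential property OSG exploits is the same one this sketch relies on: workloads made of many independent pieces scale across any collection of machines, with no supercomputer required.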
Expanded over the last six years to more than 80 sites that contribute users, data storage and processing capacity, OSG now delivers more than 2 million computing hours and moves about a third of a petabyte of data every day.
“The commitment from the two agencies will take the capabilities and culture we’ve developed to more campuses throughout the United States,” said Livny, OSG’s principal investigator. “It is about advancing the state of the art to support education and research in more science domains and improve our ability to handle more data.”
The OSG Consortium bridges organizational boundaries, working directly with faculty, students and system administrators at campuses across the nation, as well as with large multinational scientific collaborations such as those at the Large Hadron Collider at the European Organization for Nuclear Research (CERN).
Michael Ernst, an Open Science Grid co-principal investigator, directs the RHIC/ATLAS Computing Facility at Brookhaven National Laboratory and coordinates computing activities across the United States for ATLAS, one of the largest particle physics experiments at Europe’s Large Hadron Collider (LHC). “Computing know-how is providing our research in nuclear and particle physics at Brookhaven a significant advantage in the global race for discoveries at the Relativistic Heavy Ion Collider (RHIC) and the LHC,” he said. “As a computational scientist who invents, develops, and ultimately operates new ways of doing science with computers, I am very excited when the tools we develop for use in nuclear and particle physics find widespread application across biology, chemistry, economics, engineering, mathematics, medicine, and physics.”
“Our close partnerships allow us to build on existing experience in working with and processing Big Data and the advanced networks needed to transport the massive datasets of the future,” Ernst said.
“Moving forward, the OSG will continue to bring these principles and technologies to the benefit of new research communities, and also expand its services, integrating networks, data and ever more complex user workflows,” said Lothar Bauerdick, OSG executive director and head of the U.S. LHC Compact Muon Solenoid experiment software and computing project.
OSG, a full partner in the NSF Extreme Digital program and a member of the XSEDE federation, will field new tools for distributed computing to facilitate sharing of computational resources both on and between campuses.
“The OSG has been developing the Virtual Data Toolkit for over 10 years,” said Frank Würthwein, an OSG co-principal investigator and physics professor at the University of California–San Diego. “This software service acts as an anchor for the DHTC community, supporting components that researchers need but are no longer supported elsewhere. Over the next five years, the OSG software services will expand into new, more community-specific, integrated software solutions via the VDT.”
The DOE Office of Science portion of the funding, up to $8.2 million, will support distributed computing efforts based at DOE national laboratories that make masses of data from experiments at the LHC available to U.S. researchers at their home institutions. The balance of the funding, contributed by NSF, will be used to promote distributed computing resources at U.S. universities.
Under the new award, nine institutions will receive funding: Brookhaven National Laboratory, Fermi National Accelerator Laboratory, University of Chicago, University of Wisconsin–Madison, Indiana University, University of California–San Diego, University of Illinois at Urbana-Champaign, University of Nebraska, and the Information Sciences Institute at the University of Southern California.
“The members of the OSG Consortium are fully committed to collaborating over the next five years to make this project a success,” said Ruth Pordes, chair of the OSG Council and the Fermilab Computing Sector associate head for Grids. “By working together, the OSG project and the scientists who use the OSG will be able to achieve great things.”
2012-1427 | Media & Communications Office