Computation and Data-Driven Discovery (C3D)
Data-Intensive Research Programs
As the sole Tier-1 computing facility in the United States for the Large Hadron Collider's ATLAS experiment—and the largest ATLAS computing center worldwide—Brookhaven's RHIC and ATLAS Computing Facility provides a large portion of the overall computing resources for U.S. collaborators and serves as the central U.S. distribution hub for ATLAS experimental data.
A systems biology knowledgebase known as KBase seeks to integrate and make broadly accessible everything we know or can learn about plants and microbes. KBase will be a unique resource, bringing together multiple research communities and empowering them with computational tools to address fundamental biological questions.
Brookhaven is using high-performance computing (HPC) methods to assess electrical grid planning and performance using the power industry's existing software tools.
New detector technology planned for NSLS-II requires new computing approaches to experiment control, data management, data analysis workflows, and metadata handling, along with high-performance data storage, retrieval, and visualization.
The Center for Functional Nanomaterials Computation Facility and Theory Group provides multi-purpose high-performance computing for internal projects. Its resources include 2,200 computing cores, supported by high-speed networking to facilitate intensive, parallel computing as well as data storage.
National Nuclear Data Center
The National Nuclear Data Center collects experimental information on nuclear structure and nuclear reactions, evaluates it, maintains nuclear databases, and uses modern information technology to disseminate the results.
Environmental Data Management
Brookhaven's Environmental Sciences Department Environmental Data Management Group created and manages an External Data Center for the Department of Energy's Atmospheric Radiation Measurement Climate Research Facility.
Lattice Quantum Chromodynamics (Lattice QCD) simulates the interaction between quarks and gluons using Monte Carlo methods based on the theory of Quantum Chromodynamics. Lattice QCD has been successfully applied to study the QCD phase transition, CP violation, hadron structure and various other important topics in theoretical nuclear and high energy physics.
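The Monte Carlo idea behind lattice simulations can be illustrated with a toy model. The sketch below runs a Metropolis update on a 2D Ising lattice; it is only an illustration of importance sampling on a discrete lattice, not Lattice QCD itself, which updates SU(3) gauge links on a 4D space-time lattice. All names and parameters here are hypothetical choices for the example.

```python
import math
import random

def metropolis_sweeps(L=8, beta=0.5, sweeps=200, seed=1):
    """Toy Metropolis Monte Carlo on a 2D Ising lattice.

    Illustrative only: real Lattice QCD applies the same
    accept/reject idea to SU(3) gauge fields in 4D.
    """
    rng = random.Random(seed)
    # Start from an ordered configuration of +1 spins.
    spins = [[1 for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                # Sum of the four nearest neighbours (periodic boundaries).
                nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                      + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
                # Energy cost of flipping this spin.
                dE = 2.0 * spins[i][j] * nn
                # Metropolis accept/reject step.
                if dE <= 0 or rng.random() < math.exp(-beta * dE):
                    spins[i][j] = -spins[i][j]
    # Average magnetisation as a simple observable.
    return sum(sum(row) for row in spins) / (L * L)

m = metropolis_sweeps()
```

In production lattice codes, the observable would instead be a gauge-invariant quantity such as a plaquette or hadron correlator, measured over many decorrelated configurations.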
CSI’s Computation and Data-Driven Discovery department, or C3D, is the gateway for expertise in high-performance workflows and distributed computing technologies, integrating high-performance computing, machine learning, and streaming analytics, then translating them into scientific discovery.
C3D conducts research, develops solutions, and provides expertise in workflows, scalable software, and HPC to address the challenges of science and engineering applications that demand large-scale, innovative solutions. The team performs advanced research on extreme-scale, extensible workflow systems while enabling novel workflow applications on leadership-class supercomputers. C3D also specializes in distributed computing research and software systems that support streaming and real-time data analytics.
For additional information and collaboration opportunities: BrookhavenLabCS@bnl.gov.
- Complex Modeling of Nanostructures
- Deep Learning for Analysis of Materials Science Data
- Dynamic Visualization and Visual Analytics for Scientific Data
- Experimental Data Curation with the Open-source Invenio Platform
- High Performance Iterative Tomography Reconstructions
- Machine Learning Assisted Material Discovery
- Provenance and Reproducibility Tools for Multi-Modal X-ray Spectroscopy Experiments
- Replicating Machine Learning Experiments in Materials Science
- Text Mining the Scientific Literature for X-ray Absorption Spectroscopy