Computational Science Initiative Event

    "The AMReX Astrophysics Suite: Simulating the Stars at the Exascale"

    Presented by Michael Zingale, Associate Professor, Dept. of Physics and Astronomy, Stony Brook University

    Thursday, July 27, 2017, 1:30 pm
    Seminar Room, Bldg. 725

    Hosted by: Meifeng Lin

    Astronomy is an observational science — we take data (primarily light) from the objects in the Universe and use it to infer how systems work. Astrophysical simulations allow us to perform virtual experiments on these systems, giving us the ability to see into stars in a way that light alone does not allow.

    Stellar systems can be modeled using the equations of hydrodynamics, together with nuclear reactions, self-gravity, complex equations of state, and, at times, radiation and magnetic fields. The resulting simulation codes are multiphysics and multiscale, and a variety of techniques have been developed to permit accurate and efficient simulations.

    We describe the adaptive mesh refinement (AMR) codes for astrophysics built upon the AMReX library: the AMReX Astrophysics Suite. We'll focus on the codes for stellar/nuclear astrophysics: Maestro and Castro. Maestro models subsonic stellar flows, while Castro focuses on highly compressible flows. They share the same microphysics (reaction networks, equations of state) and parallelization strategy.

    Through AMReX, we distribute the boxes in our AMR hierarchy across nodes, and we use OpenMP (via a logical tiling model in Castro) to spread the work on a box across the cores within a node. Recently we've implemented a GPU strategy in AMReX that lets us offload expensive computational kernels onto GPUs. We'll discuss the current performance of the hydrodynamics and reaction networks on GPUs and how our strategy will evolve in the future.
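    The logical-tiling idea mentioned above can be sketched in a few lines. This is an illustrative example only, not the AMReX API: the `Box` struct and `sum_box_tiled` function are hypothetical stand-ins showing how one box of an AMR level can be split into fixed-size tiles, with an OpenMP loop distributing tiles (rather than individual rows) across a node's cores so each thread works on a cache-friendly chunk.

    ```cpp
    #include <algorithm>
    #include <cassert>
    #include <vector>

    // Hypothetical stand-in for one box of an AMR level (not the AMReX API).
    struct Box {
        int nx, ny;                 // box dimensions
        std::vector<double> data;   // cell-centered values, row-major
    };

    // Reduce over all cells in the box, tile by tile. The two outer loops
    // enumerate tiles; OpenMP spreads those tiles across threads, and each
    // thread then sweeps the cells of its tile contiguously in memory.
    double sum_box_tiled(const Box& box, int tile) {
        const int ntx = (box.nx + tile - 1) / tile;   // number of tiles in x
        const int nty = (box.ny + tile - 1) / tile;   // number of tiles in y
        double total = 0.0;
        #pragma omp parallel for collapse(2) reduction(+:total)
        for (int tj = 0; tj < nty; ++tj) {
            for (int ti = 0; ti < ntx; ++ti) {
                const int ilo = ti * tile, ihi = std::min(ilo + tile, box.nx);
                const int jlo = tj * tile, jhi = std::min(jlo + tile, box.ny);
                for (int j = jlo; j < jhi; ++j)
                    for (int i = ilo; i < ihi; ++i)
                        total += box.data[j * box.nx + i];
            }
        }
        return total;
    }

    int main() {
        Box box{8, 8, std::vector<double>(64, 1.0)};
        assert(sum_box_tiled(box, 4) == 64.0);   // 64 cells of 1.0
        return 0;
    }
    ```

    The same per-tile kernel structure is what makes a GPU offload natural: each tile (or each cell within it) becomes an independent unit of work that can be mapped onto GPU threads.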