Published April 15, 2022 | Published + Supplemental Material + Accepted Version
Journal Article (Open Access)

A scalable elliptic solver with task-based parallelism for the SpECTRE numerical relativity code

Abstract

Elliptic partial differential equations must be solved numerically for many problems in numerical relativity, such as initial data for every simulation of merging black holes and neutron stars. Existing elliptic solvers can take multiple days to solve these problems at high resolution and when matter is involved, because they are either hard to parallelize or require a large amount of computational resources. Here we present a new solver for linear and nonlinear elliptic problems that is designed to scale with resolution and to parallelize on computing clusters. To achieve this we employ a discontinuous Galerkin discretization, an iterative multigrid-Schwarz preconditioned Newton-Krylov algorithm, and a task-based parallelism paradigm. To accelerate convergence of the elliptic solver we have developed novel subdomain-preconditioning techniques. We find that our multigrid-Schwarz preconditioned elliptic solves achieve iteration counts that are independent of resolution, and our task-based parallel programs scale over 200 million degrees of freedom to at least a few thousand cores. Our new code solves a classic initial data problem for binary black holes faster than the spectral code SpEC when distributed to only eight cores, and in a fraction of the time on more cores. It is publicly accessible in the next-generation SpECTRE numerical relativity code. Our results pave the way for highly parallel elliptic solves in numerical relativity and beyond.
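The abstract's solver stack (an outer Newton iteration whose linearized problems are solved by a Krylov method with Schwarz-type subdomain preconditioning) can be illustrated with a minimal sketch. This is not SpECTRE code: it uses a 1D finite-difference stand-in for the paper's discontinuous Galerkin discretization, a simple nonlinear Poisson problem -u'' + u³ = f with homogeneous Dirichlet boundaries, and a non-overlapping block-Jacobi preconditioner as the simplest additive-Schwarz variant; all names and parameters are illustrative assumptions.

```python
import math

def residual(u, f, h):
    """F(u)_i = (-u[i-1] + 2u[i] - u[i+1])/h^2 + u[i]^3 - f[i], with u = 0 at the boundaries."""
    n = len(u)
    F = []
    for i in range(n):
        ul = u[i - 1] if i > 0 else 0.0
        ur = u[i + 1] if i < n - 1 else 0.0
        F.append((-ul + 2.0 * u[i] - ur) / h**2 + u[i]**3 - f[i])
    return F

def jac_apply(u, v, h):
    """Apply the Jacobian J(u) = tridiag(-1, 2, -1)/h^2 + diag(3 u_i^2) to a vector v."""
    n = len(u)
    out = []
    for i in range(n):
        vl = v[i - 1] if i > 0 else 0.0
        vr = v[i + 1] if i < n - 1 else 0.0
        out.append((-vl + 2.0 * v[i] - vr) / h**2 + 3.0 * u[i]**2 * v[i])
    return out

def thomas(sub, diag, sup, rhs):
    """Direct solve of a tridiagonal system (Thomas algorithm); the local subdomain solve."""
    n = len(diag)
    c, d = [0.0] * n, [0.0] * n
    c[0] = sup[0] / diag[0]
    d[0] = rhs[0] / diag[0]
    for i in range(1, n):
        piv = diag[i] - sub[i] * c[i - 1]
        c[i] = sup[i] / piv if i < n - 1 else 0.0
        d[i] = (rhs[i] - sub[i] * d[i - 1]) / piv
    x = [0.0] * n
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = d[i] - c[i] * x[i + 1]
    return x

def schwarz_precond(u, r, h, block=8):
    """Additive Schwarz in its simplest (non-overlapping, block-Jacobi) form:
    solve the Jacobian restricted to each subdomain exactly, combine the pieces."""
    n = len(u)
    z = [0.0] * n
    off = -1.0 / h**2
    for s in range(0, n, block):
        e = min(s + block, n)
        m = e - s
        sub = [0.0] + [off] * (m - 1)
        sup = [off] * (m - 1) + [0.0]
        diag = [2.0 / h**2 + 3.0 * u[s + i]**2 for i in range(m)]
        z[s:e] = thomas(sub, diag, sup, r[s:e])
    return z

def pcg(u, b, h, tol=1e-10, maxit=200):
    """Krylov inner solve: preconditioned conjugate gradients on J(u) x = b."""
    n = len(b)
    x, r = [0.0] * n, list(b)
    z = schwarz_precond(u, r, h)
    p = list(z)
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(maxit):
        Ap = jac_apply(u, p, h)
        alpha = rz / sum(pi * Api for pi, Api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        if math.sqrt(sum(ri * ri for ri in r)) < tol:
            break
        z = schwarz_precond(u, r, h)
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

def newton_krylov(f, h, tol=1e-10, maxit=20):
    """Newton outer loop: each step solves J(u) du = -F(u) with the preconditioned Krylov solver."""
    u = [0.0] * len(f)
    for _ in range(maxit):
        F = residual(u, f, h)
        if math.sqrt(sum(Fi * Fi for Fi in F)) < tol:
            break
        du = pcg(u, [-Fi for Fi in F], h)
        u = [ui + dui for ui, dui in zip(u, du)]
    return u
```

As a usage example, manufacturing a solution u(x) = sin(πx) gives f(x) = π² sin(πx) + sin³(πx); calling `newton_krylov` on that f recovers sin(πx) to within the finite-difference discretization error. The paper's multigrid layer, which sits on top of the Schwarz smoother and is what makes iteration counts resolution-independent, is omitted here for brevity.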

Additional Information

© 2022 Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI. Open access publication funded by the Max Planck Society. Received 15 November 2021; accepted 8 February 2022; published 18 April 2022. The authors thank Tim Dietrich, Francois Foucart, and Hannes Rüter for helpful discussions. N. V. also thanks the Cornell Center for Astrophysics and Planetary Science and TAPIR at Caltech for the hospitality and financial support during research stays. Computations were performed with the SpECTRE code [29] on the Minerva cluster at the Max Planck Institute for Gravitational Physics. Charm++ [38] was developed by the Parallel Programming Laboratory in the Department of Computer Science at the University of Illinois at Urbana-Champaign. The figures in this article were produced with dgpy [76], matplotlib [77,78], TikZ [79] and ParaView [80]. This work was supported in part by the Sherman Fairchild Foundation and by NSF Grants No. PHY-2011961, No. PHY-2011968, and No. OAC-1931266 at Caltech, and NSF Grants No. PHY-1912081 and No. OAC-1931280 at Cornell. G. L. is pleased to acknowledge support from the NSF through Grants No. PHY-1654359 and No. AST-1559694 and from Nicholas and Lee Begovich and the Dan Black Family Trust.

Attached Files

Published - PhysRevD.105.084027.pdf

Accepted Version - 2111.06767.pdf

Supplemental Material - Bbh.yaml

Supplemental Material - KerrSchild.yaml

Supplemental Material - Poisson.yaml

Files (8.2 MB total)

md5:d6b81bb44ac366d80aa7413844d682fd (1.8 kB)
md5:6a9a06440cfcc55f03f28f9d664c4a4a (3.1 kB)
md5:6ac58ae5b456a5a76a190fc66aabb1a9 (2.8 MB)
md5:03b262b897bede260b6405ac830e2322 (5.4 MB)
md5:0a1a2d6226ada7802edfb9bde3847a00 (6.8 kB)

Additional details

Created:
August 20, 2023
Modified:
October 24, 2023