Published September 10, 2019 | Submitted + Published
Journal Article | Open Access

The Impact of Enhanced Halo Resolution on the Simulated Circumgalactic Medium

Abstract

Traditional cosmological hydrodynamics simulations fail to spatially resolve the circumgalactic medium (CGM), the reservoir of tenuous gas surrounding a galaxy and extending to its virial radius. We introduce the technique of Enhanced Halo Resolution (EHR), enabling more realistic physical modeling of the simulated CGM by consistently forcing gas refinement to smaller scales throughout the virial halo of a simulated galaxy. We investigate the effects of EHR in the tempest simulations, a suite of enzo-based cosmological zoom simulations following the evolution of an L* galaxy, resolving spatial scales of 500 comoving pc out to 100 comoving kpc in galactocentric radius. Among its many effects, EHR (1) changes the thermal balance of the CGM, increasing its cool gas content and decreasing its warm/hot gas content; (2) preserves cool gas structures for longer periods; and (3) enables these cool clouds to exist at progressively smaller size scales. Observationally, this results in a boost in "low ions" like H i and a drop in "high ions" like O vi throughout the CGM. These effects of EHR do not converge in the tempest simulations, but extrapolating these trends suggests that the CGM is actually a mist consisting of ubiquitous, small, long-lived, cool clouds suspended in a medium at the halo virial temperature. We find that EHR produces the above effects by (1) better sampling the distribution of CGM phases, enabling runaway cooling in the dense, cool tail of the phase distribution; and (2) preventing cool gas clouds from artificially mixing with the ambient hot halo and evaporating.
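The forced-refinement idea behind EHR — refine any gas cell within a fixed galactocentric radius until it reaches a target comoving scale — can be sketched as below. This is an illustrative sketch only, not the actual enzo implementation; the function names are hypothetical, and the parameter values are the scales quoted in the abstract (500 comoving pc out to 100 comoving kpc).

```python
# Illustrative sketch of an Enhanced Halo Resolution (EHR) style
# refinement criterion: force AMR refinement of any cell within a
# fixed galactocentric radius until it reaches a target spatial scale.
# Function names and grid assumptions are hypothetical; only the two
# scale parameters come from the paper's abstract.
TARGET_SCALE_PC = 500.0    # target resolution, comoving pc
EHR_RADIUS_KPC = 100.0     # EHR region radius, comoving kpc

def needs_refinement(cell_size_pc, distance_kpc):
    """True if a cell inside the EHR region is still coarser than the target."""
    return distance_kpc <= EHR_RADIUS_KPC and cell_size_pc > TARGET_SCALE_PC

def levels_required(root_cell_pc, refine_by=2):
    """Number of factor-`refine_by` refinements needed to reach the target scale."""
    levels = 0
    size = root_cell_pc
    while size > TARGET_SCALE_PC:
        size /= refine_by
        levels += 1
    return levels
```

For example, under these assumptions a 32 comoving kpc root-grid cell would need `levels_required(32000.0)` = 6 successive factor-of-2 refinements to reach 500 comoving pc.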

Additional Information

© 2019 The American Astronomical Society. Received 2018 November 29; revised 2019 July 17; accepted 2019 July 30; published 2019 September 13. The authors are very grateful to X. Prochaska, Brant Robertson, Kate Rubin, Greg Bryan, Gwen Rudie, Shea Garrison-Kimmel, Claude-André Faucher-Giguère, Zach Hafen, Max Gronke, Sean Johnson, and Anatoly Klypin for helpful discussions. In particular, we thank Matt Turk, Nathan Goldbaum, John Zuhone, Kacper Kowalik, Bili Dong, and other members of our yt developer community, who were invaluable in producing and maintaining much of the code necessary to process these data sets. Support for C.B.H. and N.L. was provided by NASA through Hubble Space Telescope (HST) theory grants HST-AR-13917 and HST-AR-13919 with additional funding for C.B.H. and D.S. from the National Science Foundation (NSF) Astronomy and Astrophysics Postdoctoral Fellowship program. B.D.S. was supported by NSF grant AST-1615848. Support for P.F.H. was provided by an Alfred P. Sloan Research Fellowship, NSF Collaborative Research grant #1715847 and CAREER grant #1455342, and NASA grants NNX15AT06G, JPL 1589742, 17-ATP17-0214. B.W.O. was funded in part by NSF grants PHY-1430152, AST-1514700, AST-1517908, OAC-1835213; by NASA grants NNX12AC98G, NNX15AP39G; and by HST-AR-14315. J.H.W. was supported by NSF grants AST-1614333 and OAC-1835213, NASA grant NNX17AG23G, and HST theory grant HST-AR-14326. I.S.B. received partial support from HST-AR-15046. These simulations were performed and analyzed on Blue Waters, operated by the National Center for Supercomputing Applications (NCSA) with PRAC allocation support by the NSF (award number ACI-1514580) and the Great Lakes Consortium for Petascale Computation. This research is part of the Blue Waters sustained-petascale computing project, which is supported by the NSF (awards OCI-0725070, ACI-1238993, and ACI-1514580) and the state of Illinois. 
Blue Waters is a joint effort of the University of Illinois at Urbana-Champaign and the NCSA. Additional computational resources for this work are from NSF XSEDE resources under allocation TG-AST140018. Computations described in this work were performed using the publicly available enzo code, which is the product of a collaborative effort of many independent scientists from numerous institutions around the world. Software: enzo (Bryan et al. 2014), grackle (Smith et al. 2017), matplotlib (Hunter 2007), numpy (Van Der Walt et al. 2011), trident (Hummels et al. 2017), yt (Turk et al. 2011), yt astro analysis (Smith et al. 2018), ytree (Smith & Lang 2018).

Attached Files

Published - Hummels_2019_ApJ_882_156.pdf

Submitted - 1811.12410.pdf


Additional details

Created: August 19, 2023
Modified: October 20, 2023