Turbulent Heating in a Stratified Medium [GA]

http://arxiv.org/abs/2205.01732


There is considerable evidence for widespread subsonic turbulence in galaxy clusters, most notably from {\it Hitomi}. Turbulence is often invoked to offset radiative losses in cluster cores, both by direct dissipation and by enabling turbulent heat diffusion. However, in a stratified medium, buoyancy forces oppose radial motions, making turbulence anisotropic. This can be quantified via the Froude number ${\rm Fr}$, which decreases inward in clusters as stratification increases. We exploit analogies with MHD turbulence to show that wave-turbulence interactions increase cascade times and reduce dissipation rates, $\epsilon \propto {\rm Fr}$. Equivalently, for a given energy injection/dissipation rate $\epsilon$, turbulent velocities $u$ must be higher than Kolmogorov scalings predict. High-resolution hydrodynamic simulations show excellent agreement with the $\epsilon \propto {\rm Fr}$ scaling, which sets in for ${\rm Fr} < 0.1$. We also test the previously predicted scaling for the turbulent diffusion coefficient, $D \propto {\rm Fr}^2$, and find excellent agreement for ${\rm Fr} < 1$. However, we find a different normalization, corresponding to diffusive suppression stronger by more than an order of magnitude. Our results imply that turbulent diffusion is more heavily suppressed by stratification, over a much wider radial range, than turbulent dissipation; the latter therefore potentially dominates the heating. Furthermore, this shift implies that significantly higher turbulent velocities are required to offset cooling than previous models suggest. These results are potentially relevant to turbulent metal diffusion (which is likewise suppressed) and to planetary atmospheres.
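To make the quoted scalings concrete, here is a minimal back-of-the-envelope sketch (not from the paper): it compares the turbulent velocity needed to supply a given dissipation rate $\epsilon$ under the Kolmogorov scaling $\epsilon \sim u^3/L$ versus the stratified scaling $\epsilon \sim (u^3/L)\,{\rm Fr}$ with ${\rm Fr} = u/(NL)$, where $N$ is the Brunt-Väisälä frequency and $L$ the driving scale. All prefactors and parameter values below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of the abstract's scalings; O(1) prefactors and the
# example numbers are illustrative assumptions, not values from the paper.

def u_kolmogorov(eps, L):
    """Unstratified cascade: eps ~ u^3 / L  =>  u ~ (eps * L)^(1/3)."""
    return (eps * L) ** (1 / 3)

def u_stratified(eps, L, N):
    """Stratification-suppressed cascade (abstract: eps ~ (u^3/L) * Fr,
    valid for Fr < 0.1), with Fr = u / (N * L). Then
    eps ~ u^4 / (N * L^2)  =>  u ~ (eps * N * L^2)^(1/4)."""
    return (eps * N * L**2) ** (1 / 4)

def D_turb(u, L, N, C=1.0):
    """Turbulent diffusivity with the Fr^2 suppression, D ~ C * u * L * Fr^2.
    The abstract reports a normalization well below earlier predictions;
    the constant C here is a placeholder."""
    Fr = u / (N * L)
    return C * u * L * Fr**2

# Rough cluster-core numbers (cgs; assumptions for illustration only):
eps = 0.1     # specific heating rate needed to offset cooling [erg g^-1 s^-1]
L   = 3e22    # driving scale ~ 10 kpc [cm]
N   = 1e-15   # Brunt-Vaisala frequency [s^-1]

u_k = u_kolmogorov(eps, L)
u_s = u_stratified(eps, L, N)
print(f"Kolmogorov:  u ~ {u_k / 1e5:.0f} km/s (Fr = {u_k / (N * L):.2f})")
print(f"Stratified:  u ~ {u_s / 1e5:.0f} km/s (Fr = {u_s / (N * L):.2f})")
print(f"D_turb ~ {D_turb(u_s, L, N):.2e} cm^2/s (placeholder C = 1)")
```

With these rough numbers the stratified scaling already demands a somewhat larger $u$ for the same $\epsilon$; the gap widens as ${\rm Fr}$ decreases, which is the abstract's point about higher required turbulent velocities.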


C. Wang, S. Oh and M. Ruszkowski
Thu, 5 May 22

Comments: N/A