Do nuclear rings in barred galaxies form at the shear minimum of the rotation curve? [GA]

http://arxiv.org/abs/2002.10559


It has recently been suggested that (i) nuclear rings in barred galaxies (including our own Milky Way) form at the radius where the shear parameter of the rotation curve reaches a minimum; and (ii) the acoustic instability of Montenegro et al. is responsible for driving turbulence and angular momentum transport in the central regions of barred galaxies. Here we test these suggestions by running simple hydrodynamical simulations in a scale-free logarithmic barred potential. Since the rotation curve of this potential is scale-free, the shear-minimum theory predicts that no ring should form. We find that, contrary to this prediction, a ring does form in the simulation, with a morphology consistent with that of nuclear rings in real barred galaxies. This proves that the presence of a shear minimum is not a necessary condition for ring formation. We also find that perturbations predicted to be acoustically unstable wind up and eventually propagate off to infinity, so the system is actually stable. We conclude that (i) the shear-minimum theory is an unlikely mechanism for the formation of nuclear rings in barred galaxies, and (ii) the acoustic instability is a spurious result and may not be able to drive turbulence in the interstellar medium, at least in the absence of self-gravity. The question of the role of turbulent viscosity remains open.
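To make the scale-free argument concrete, here is a minimal sketch, assuming the shear parameter is the commonly used Gamma = -d ln Omega / d ln R (the abstract does not specify the exact definition, so this convention is an assumption). For the axisymmetric part of a scale-free logarithmic potential the rotation curve is flat, so the shear parameter takes the same value at every radius and has no minimum:

\[
\Phi_0(R) = v_0^2 \ln R
\quad\Rightarrow\quad
v_c^2(R) = R\,\frac{d\Phi_0}{dR} = v_0^2,
\qquad
\Omega(R) = \frac{v_c}{R} = \frac{v_0}{R},
\]
\[
\Gamma(R) \equiv -\frac{d\ln\Omega}{d\ln R} = 1
\quad\text{for all } R.
\]

Under this assumption, the shear-minimum picture singles out no preferred radius in this potential, which is why it predicts that no ring should form there.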


M. Sormani and Z. Li
Wed, 26 Feb 20

Comments: 6 pages, 2 figures. Comments welcome