http://arxiv.org/abs/2209.05846
Reconstructing the large-scale density and velocity fields from surveys of galaxy distances is a major challenge for cosmography. The data are very noisy and sparse, and the estimated distances, and thereby the peculiar velocities, are strongly affected by a Malmquist-like lognormal bias. Two algorithms have recently been introduced to perform reconstructions from such data: the Bias Gaussian correction coupled with the Wiener filter (BGc/WF) and the HAMLET implementation of Hamiltonian Monte Carlo forward modelling. The two methods are tested here against mock catalogs that mimic the Cosmicflows-3 data. Specifically, the reconstructed cosmography and the moments of the velocity field (monopole, dipole) are examined. A comparison is also made to the “exact” Wiener filter – namely the Wiener filter in the unrealistic case of zero observational errors – in order to probe the limits of the WF method. The following is found. In the nearby regime ($d \lesssim 40\,{\rm Mpc}/h$) the two methods perform roughly equally well. HAMLET does slightly better in the intermediate regime ($40 \lesssim d \lesssim 120\,{\rm Mpc}/h$). The main differences between the two appear in the most distant regime ($d \gtrsim 120\,{\rm Mpc}/h$), close to the edge of the data: HAMLET outperforms the BGc/WF in terms of better and tighter correlations, yet in this regime it yields a somewhat biased reconstruction, a bias absent from the BGc/WF reconstruction. In sum, both methods perform well and produce reliable reconstructions, with significant differences apparent only when the details are examined.
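As a toy illustration only (not the paper's implementation, and with made-up variances rather than anything fitted to Cosmicflows-3), the core idea of a Wiener filter can be sketched for the simplest case of a Gaussian signal observed with uncorrelated Gaussian noise: each datum is shrunk toward zero by the gain $S/(S+N)$, where $S$ and $N$ are the signal and noise variances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the paper):
sigma_s = 1.0   # prior (signal) standard deviation
sigma_n = 0.7   # observational noise standard deviation
n = 10_000

signal = rng.normal(0.0, sigma_s, n)          # underlying "true" values
data = signal + rng.normal(0.0, sigma_n, n)   # noisy observations

# Wiener filter for independent Gaussian signal + noise:
# estimate = S / (S + N) * data
gain = sigma_s**2 / (sigma_s**2 + sigma_n**2)
estimate = gain * data

mse_raw = np.mean((data - signal) ** 2)       # error of the raw data
mse_wf = np.mean((estimate - signal) ** 2)    # error of the WF estimate
print(mse_wf < mse_raw)
```

The shrinkage toward the prior mean is what makes the WF conservative near the edge of the data, where the noise dominates and the gain tends to zero; the HAMLET forward-modelling approach instead samples the posterior directly.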
A. Valade, N. Libeskind, Y. Hoffman, et al.
Wed, 14 Sep 22
Comments: 13 pages, 12 figures