How much (robust) information is there in galaxy clustering?

All large-scale-structure cosmologists face the question: how do we robustly extract cosmological information — on dark energy, gravity, and inflation — from observed tracers such as galaxies, whose astrophysics is extremely complex and incompletely understood? I will describe why guaranteeing this robustness is so difficult, and how a perturbative effective-field-theory (EFT) approach offers such a guarantee when focusing on galaxy clustering on large scales. The natural next question is: how much cosmological information remains on these large scales once we marginalize over all the free parameters introduced in the EFT? To answer this question, I will introduce our implementation of the EFT on a lattice as an explicit field-level forward model, which can be used both for full Bayesian inference at the field level and for likelihood-free inference based on summary statistics. One crucial advantage of this forward model is its non-perturbative treatment of the displacement from initial positions to observed coordinates, with ramifications for BAO reconstruction and redshift-space distortions.
Fabian Schmidt
Max Planck Institute for Astrophysics