Starlet l1-norm for weak lensing cosmology


Authors:

Virginia Ajani, Jean-Luc Starck, Valeria Pettorino

Journal:
Astronomy & Astrophysics, Letters to the Editor, forthcoming article
Year: 2021
Download: A&A | arXiv


Abstract

We present a new summary statistic for weak lensing observables, higher than second order, suitable for extracting non-Gaussian cosmological information and inferring cosmological parameters. We name this statistic the 'starlet l1-norm' as it is computed via the sum of the absolute values of the starlet (wavelet) decomposition coefficients of a weak lensing map. In comparison to the state-of-the-art higher-order statistics -- weak lensing peak counts and minimum counts, or the combination of the two -- the l1-norm provides a fast multi-scale calculation of the full void and peak distribution, avoiding the problem of defining what a peak is and what a void is: the l1-norm carries the information encoded in all pixels of the map, not just the ones in local maxima and minima. We show its potential by applying it to the weak lensing convergence maps provided by the MassiveNuS simulations to get constraints on the sum of neutrino masses, the matter density parameter, and the amplitude of the primordial power spectrum. We find that, in an ideal setting without further systematics, the starlet l1-norm remarkably outperforms commonly used summary statistics, such as the power spectrum or the combination of peak and void counts, in terms of constraining power, representing a promising new unified framework to simultaneously account for the information encoded in peak counts and voids. We find that the starlet l1-norm outperforms the power spectrum by 72% on Mν, 60% on Ωm, and 75% on As for the Euclid-like setting considered; it also improves upon the state-of-the-art combination of peaks and voids for a single smoothing scale by 24% on Mν, 50% on Ωm, and 24% on As.
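The statistic itself is simple to compute once the wavelet decomposition is in hand. Below is a minimal numerical sketch, assuming NumPy/SciPy and the standard à trous algorithm with the B3-spline kernel for the starlet transform; note that the published statistic additionally bins the coefficients by signal-to-noise amplitude within each scale, whereas this sketch returns a single number per scale for brevity.

```python
import numpy as np
from scipy.ndimage import convolve1d

def starlet_transform(image, n_scales):
    """Isotropic undecimated (a trous) starlet transform with the B3-spline kernel."""
    h = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    c = np.asarray(image, dtype=float)
    coeffs = []
    for j in range(n_scales):
        kernel = np.zeros(4 * 2**j + 1)
        kernel[::2**j] = h                     # dilate the kernel by inserting zeros
        smooth = convolve1d(convolve1d(c, kernel, axis=0, mode='reflect'),
                            kernel, axis=1, mode='reflect')
        coeffs.append(c - smooth)              # wavelet (detail) coefficients at scale j
        c = smooth
    coeffs.append(c)                           # coarse residual; sum(coeffs) == image
    return coeffs

def starlet_l1_norm(image, n_scales=4):
    """l1-norm per wavelet scale: sum of absolute starlet coefficients."""
    coeffs = starlet_transform(image, n_scales)
    return np.array([np.abs(w).sum() for w in coeffs[:-1]])
```

Because the à trous transform is redundant and additive, summing all the returned coefficient planes recovers the input map exactly, which makes the per-scale l1-norms a lossless-by-construction multi-scale compression of the pixel distribution.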

Euclid preparation: VII. Forecast validation for Euclid cosmological probes



Abstract

Aims: The Euclid space telescope will measure the shapes and redshifts of galaxies to reconstruct the expansion history of the Universe and the growth of cosmic structures. The estimation of the expected performance of the experiment, in terms of predicted constraints on cosmological parameters, has so far relied on various individual methodologies and numerical implementations, which were developed for different observational probes and for the combination thereof. In this paper we present validated forecasts, which combine both theoretical and observational ingredients for different cosmological probes. This work is presented to provide the community with reliable numerical codes and methods for Euclid cosmological forecasts.
Methods: We describe in detail the methods adopted for Fisher matrix forecasts, which were applied to galaxy clustering, weak lensing, and the combination thereof. We estimate the required accuracy for Euclid forecasts and outline a methodology for their development. We then compare and improve different numerical implementations, reaching uncertainties on the errors of cosmological parameters that are less than the required precision in all cases. Furthermore, we provide details on the validated implementations, some of which are made publicly available, in different programming languages, together with a reference training set of input and output matrices for a set of specific models. These can be used by the reader to validate their own implementations if required.
Results: We present new cosmological forecasts for Euclid. We find that results depend on the specific cosmological model and remaining freedom in each setting, for example flat or non-flat spatial cosmologies, or different cuts at non-linear scales. The numerical implementations are now reliable for these settings. We present the results for an optimistic and a pessimistic choice for these types of settings. We demonstrate that the impact of cross-correlations is particularly relevant for models beyond a cosmological constant and may allow us to increase the dark energy figure of merit by at least a factor of three.
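The Fisher formalism behind these forecasts can be summarised in a few lines: for a Gaussian likelihood with parameter-independent covariance, the Fisher matrix is F_ab = (∂μ/∂θ_a)^T C⁻¹ (∂μ/∂θ_b), and the marginalised 1σ errors are the square roots of the diagonal of F⁻¹. A minimal sketch follows; the array shapes and function names are illustrative and are not the validated Euclid codes, which build the derivatives from galaxy clustering and weak lensing power spectra.

```python
import numpy as np

def fisher_matrix(derivs, inv_cov):
    """Fisher matrix F_ab = d_a^T C^-1 d_b for a Gaussian likelihood
    with parameter-independent covariance.
    derivs : (n_params, n_data) derivatives of the observable w.r.t. parameters.
    inv_cov: (n_data, n_data) inverse data covariance."""
    return derivs @ inv_cov @ derivs.T

def marginalised_errors(fisher):
    """1-sigma marginalised errors: sqrt of the diagonal of the inverse Fisher matrix."""
    return np.sqrt(np.diag(np.linalg.inv(fisher)))
```

As a toy check, derivatives diag(1, 2) with unit covariance give F = diag(1, 4) and marginalised errors (1, 0.5); cross-validating such matrices between independent implementations is exactly the exercise carried out in this paper.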

 

Euclid: The importance of galaxy clustering and weak lensing cross-correlations within the photometric Euclid survey



Abstract

Context. The data from the Euclid mission will enable the measurement of the angular positions and weak lensing shapes of over a billion galaxies, with their photometric redshifts obtained in combination with ground-based observations. This large dataset, with well-controlled systematic effects, will allow for cosmological analyses using the angular clustering of galaxies (GCph) and cosmic shear (WL). For Euclid, these two cosmological probes will not be independent because they will probe the same volume of the Universe. The cross-correlation (XC) between these probes can tighten constraints, and it is therefore important to quantify its impact for Euclid.
Aims: In this study, we therefore extend the recently published Euclid forecasts by carefully quantifying the impact of XC not only on the final parameter constraints for different cosmological models, but also on the nuisance parameters. In particular, we aim to decipher the amount of additional information that XC can provide for parameters encoding systematic effects, such as galaxy bias, intrinsic alignments (IAs), and knowledge of the redshift distributions.
Methods: We follow the Fisher matrix formalism and make use of previously validated codes. We also investigate a different galaxy bias model, obtained from the Flagship simulation, as well as additional photometric-redshift uncertainties, and we elucidate the impact of including the XC terms on constraining the latter.
Results: Starting with a baseline model, we show that the XC terms reduce the uncertainties on galaxy bias by ∼17% and the uncertainties on IA by a factor of about four. The XC terms also help in constraining the γ parameter for minimal modified gravity models. Concerning galaxy bias, we observe that the role of the XC terms on the final parameter constraints is qualitatively the same irrespective of the specific galaxy-bias model used. For IA, we show that the XC terms can help in distinguishing between different models, and that if IA terms are neglected then this can lead to significant biases on the cosmological parameters. Finally, we show that the XC terms can lead to a better determination of the mean of the photometric galaxy distributions.
Conclusions: We find that the XC between GCph and WL within the Euclid survey is necessary to extract the full information content from the data in future analyses. These terms help in better constraining the cosmological model, and also lead to a better understanding of the systematic effects that contaminate these probes. Furthermore, we find that XC significantly helps in constraining the mean of the photometric-redshift distributions, but, at the same time, it requires more precise knowledge of this mean with respect to single probes in order not to degrade the final "figure of merit".

XC importance
Ratio of the errors on Δz_i without and with the inclusion of XC. Yellow and red lines refer to the pessimistic and optimistic scenarios, respectively.

 

Beyond self-acceleration: force- and fluid-acceleration

The notion of self-acceleration has been introduced as a convenient way to theoretically distinguish cosmological models in which acceleration is due to modified gravity from those in which it is due to the properties of matter or fields. In this paper we review the concept of self-acceleration as given, for example, by [1], and highlight two problems. First, it applies only to universal couplings; second, it is too narrow, i.e. it excludes models in which the acceleration can be shown to be induced by a genuine modification of gravity, for instance coupled dark energy with a universal coupling, the Hu-Sawicki f(R) model or, in the context of inflation, the Starobinsky model. We then propose two new, more general, concepts in its place: force-acceleration and fluid-acceleration, which are also applicable in the presence of non-universal couplings. We illustrate their concrete application with two examples among the modified gravity classes that are still in agreement with current data, i.e. f(R) models and coupled dark energy.

As noted already, for example, in [35, 36], we further remark that at present non-universal couplings are among the (few) classes of models which survive gravitational wave detection and local constraints (see [12] for a review of models surviving with a universal coupling). This is because, by construction, baryonic interactions are standard and satisfy solar system constraints; furthermore, the speed of gravitational waves in these models is c_T = 1 and therefore in agreement with gravitational wave detection. It has also been noted (see for example [37-39] and the update in [33]) that models in which a non-universal coupling to dark matter particles is considered would also solve the tension in the measurement of the Hubble parameter [40], due to the β-H0 degeneracy first noted in Ref. [41].

Reference: L. Amendola, V. Pettorino, "Beyond self-acceleration: force- and fluid-acceleration", Physics Letters B, in press, 2020.


DeepMass: The first Deep Learning reconstruction of dark matter maps from weak lensing observational data (DES SV weak lensing data)


This is the first reconstruction of dark matter maps from weak lensing observational data using deep learning. We train a convolutional neural network (CNN) with a U-Net-based architecture on over 3.6 x 10^5 simulated data realisations with non-Gaussian shape noise and with cosmological parameters varying over a broad prior distribution. Our DeepMass method is substantially more accurate than existing mass-mapping methods. On a validation set of 8000 simulated DES SV data realisations, DeepMass improved the mean squared error (MSE) by 11 per cent compared to Wiener filtering with a fixed power spectrum. With N-body simulated MICE mock data, we show that Wiener filtering with the optimal known power spectrum still gives a worse MSE than our generalised method with no input cosmological parameters; we show that the improvement is driven by the non-linear structures in the convergence. With higher galaxy density in future weak lensing data unveiling more non-linear scales, it is likely that deep learning will be a leading approach for mass mapping with Euclid and LSST.
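For context, the Wiener-filter baseline that DeepMass is compared against is a linear operation in Fourier space: each mode of the noisy convergence map is multiplied by S/(S+N), the ratio of the assumed signal power to the total power. A minimal flat-sky sketch is shown below; the function names and the white-noise assumption are illustrative, not the DES pipeline.

```python
import numpy as np

def wiener_filter(noisy_map, signal_power, noise_power):
    """Flat-sky Wiener filter: multiply each Fourier mode by S/(S+N).
    signal_power: callable returning the signal power at dimensionless |k|;
    noise_power : scalar white-noise power per mode."""
    ny, nx = noisy_map.shape
    kx = np.fft.fftfreq(nx)
    ky = np.fft.fftfreq(ny)
    kmag = np.sqrt(kx[None, :]**2 + ky[:, None]**2)
    S = signal_power(kmag)
    W = S / (S + noise_power)                  # per-mode Wiener weight
    return np.real(np.fft.ifft2(W * np.fft.fft2(noisy_map)))
```

With zero noise power the filter reduces to the identity; as the noise power grows, modes are damped toward zero. DeepMass replaces this fixed linear rule with a CNN trained to approximate the posterior-mean map, which is what allows it to exploit the non-linear structures mentioned above.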

Reference: N. Jeffrey, F. Lanusse, O. Lahav, J.-L. Starck, "Learning dark matter map reconstructions from DES SV weak lensing data", Monthly Notices of the Royal Astronomical Society, in press, 2019.

 

Diffuse Galactic thermal dust emission: modified black-body parameter maps



Diffuse emissions are ubiquitous within our Galaxy. They probe star-forming regions, the chemical composition of the Galaxy, and the Galactic magnetic field. Conversely, they also obscure cosmological measurements such as the cosmic microwave background and the epoch of reionisation signal. Detailed characterisation of these emissions is therefore of interest to both cosmologists and astrophysicists. In early 2019, CosmoStat released its contribution to the investigation of thermal dust emission in the form of modified black-body temperature, spectral index, and optical depth maps. These intensity maps are presented at Nside 2048 with a FWHM of 5 arcmin and were made from Planck (data release 2) HFI and IRAS data.
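The modified black-body model underlying these maps is I_ν = τ (ν/ν₀)^β B_ν(T), fitted per pixel to multi-frequency data. A toy single-pixel sketch using a generic least-squares fit is given below; the frequency set, normalisation, and use of curve_fit are illustrative only, since the released maps were produced with a sparse parametric technique rather than an independent per-pixel fit.

```python
import numpy as np
from scipy.optimize import curve_fit

H = 6.626e-34    # Planck constant [J s]
KB = 1.381e-23   # Boltzmann constant [J/K]

def planck(nu_ghz, T):
    """Planck spectrum B_nu(T) in arbitrary units (normalised at 353 GHz)."""
    x = H * nu_ghz * 1e9 / (KB * T)
    return (nu_ghz / 353.0)**3 / np.expm1(x)

def mbb(nu_ghz, tau, beta, T):
    """Modified black body: tau * (nu/nu0)^beta * B_nu(T), with nu0 = 353 GHz."""
    return tau * (nu_ghz / 353.0)**beta * planck(nu_ghz, T)

# Toy single-pixel fit on noiseless synthetic data.
nu = np.array([217.0, 353.0, 545.0, 857.0, 3000.0])   # GHz (HFI bands + IRAS 100 um)
truth = (1.0, 1.6, 20.0)                              # tau, beta, T [K]
data = mbb(nu, *truth)
popt, _ = curve_fit(mbb, nu, data, p0=(0.5, 1.5, 18.0))
```

In real data the well-known β-T degeneracy makes such independent per-pixel fits noisy, which is precisely the problem the sparse parametric technique of the reference addresses.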

Reference: M. Irfan, J. Bobin, M.-A. Miville-Deschênes and I. Grenier, "Determining thermal dust emission from Planck HFI data using a sparse parametric technique", A&A, 623, 2019.

 

Weak Lensing 2D and 3D Density Fluctuation Map Reconstruction



3D tomographic weak lensing is one of the most important tools of modern cosmology. Building on the link between weak lensing and compressed sensing theory, we have proposed a new approach to reconstruct the dark matter distribution in two and three dimensions, using photometric redshift information. We have shown that we can estimate with very good accuracy the mass and redshift of dark matter haloes, which is crucial for unveiling the nature of the Dark Universe (Leonard et al. 2014), and that the method significantly outperforms all existing approaches. In particular, we have shown using simulations that we can reconstruct two clusters along the same line of sight, which was impossible with previous methods. The method has been chosen by the DES consortium to generate its weak lensing mass map (Jeffrey et al. 2018).

Reference 1: A. Leonard, F. Lanusse and J.-L. Starck, "GLIMPSE: Accurate 3D weak lensing reconstructions using sparsity", MNRAS, 440, 2, 2014.

Reference 2: F. Lanusse, J.-L. Starck, A. Leonard, S. Pires, "High Resolution Weak Lensing Mass-Mapping Combining Shear and Flexion", Astronomy and Astrophysics, 591, id.A2, 19 pp, 2016.

Reference 3: N. Jeffrey et al., MNRAS, 479, 2018, arXiv:1801.08945.

Press release: CEA press release

 

Cosmology and Fundamental Physics with the Euclid Satellite



Understanding the source of cosmic acceleration in the Universe is one of the major challenges that will be addressed by future surveys like the Euclid space mission. Acceleration may be caused by a cosmological constant or by a dynamical fluid (dark energy), or it may rather be a sign that the laws of gravity themselves are different at very large scales. Euclid data interpretation will aim at discriminating among these scenarios. CosmoStat is active in the Theory Working Group, and V. Pettorino led the update of the review "Cosmology and Fundamental Physics with the Euclid Satellite", published in Living Reviews in Relativity in 2018 (https://link.springer.com/article/10.1007%2Fs41114-017-0010-3). The figure below (originally from Hu and Sawicki (2007a), replotted as Fig. 19 of the Review) shows constraints expected for Euclid on the growth factor, for different cosmological scenarios.

Reference: Euclid Theory Working Group, "Cosmology and Fundamental Physics with the Euclid Satellite", Living Reviews in Relativity, published on 12 April 2018.

DOI: https://doi.org/10.1007/s41114-017-0010-3

Radio-Interferometry: Improving the Resolution by a Factor of 4 (2 in each spatial dimension)



Sparse recovery allows us to reconstruct radio-interferometric images with a resolution increased by a factor of two in each spatial dimension. This has been confirmed by comparing two images of the Cygnus A radio source: one from the LOFAR instrument, reconstructed using sparsity, and one from the Very Large Array (VLA) at a higher frequency (and therefore with a better resolution). The contours of the VLA image in Fig. 1 (right) match the high-resolution features visible in the LOFAR colour map, while Fig. 1 (left) shows what the standard LOFAR pipeline produces on the same data. The small details that appear in the colour image on the right, but not in the left image, are real, since they are also present in the contours of the higher-resolution VLA image.

Reference: Garsden, Girard, Starck, Corbel et al., A&A, 2015.

Press release 1: http://irfu.cea.fr/dap/Phocea/Vie_des_labos/Ast/ast.php?t=actu&id_ast=3548

Press release 2: French researchers push forward radio image quality in view of SKA,  Thursday 10 November 2016.

Press release 3: Astrophysique et IRM, un mariage qui a du sens, May 17, 2017.

Cosmic Microwave Background: Joint WMAP/Planck CMB Map Recovery



The LGMCA method has been used to reconstruct the Cosmic Microwave Background (CMB) map from WMAP 9-year and Planck PR2 data. Based on the sparse modelling of signals, this component separation method is well suited to the extraction of foreground emissions. A joint WMAP 9-year and Planck PR2 CMB map has been reconstructed, producing a very high-quality estimate, especially towards the Galactic center, where estimation is most difficult due to the strong foreground emissions of our Galaxy. This LGMCA CMB map exhibits appealing characteristics for astrophysical and cosmological applications: i) it is a full-sky map that did not require any inpainting or interpolation post-processing; ii) foreground contamination is shown to be very low, even towards the Galactic center; iii) it does not exhibit any detectable trace of thermal SZ contamination. Furthermore, following the principle of reproducible research, LCS provides the codes to reproduce the LGMCA map, which makes it the only reproducible CMB map.
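GMCA-type component separation models the multi-frequency data as X ≈ AS, with sparse sources S and an unknown mixing matrix A, estimated by alternating least-squares updates with thresholding. The toy sketch below illustrates only this core idea; the real LGMCA pipeline works on wavelet coefficients with spatially varying, decreasing thresholds, and all names here are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Soft thresholding, the proximal operator of the l1-norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def gmca(X, n_sources, lam, n_iter=200, seed=0):
    """Toy GMCA-style blind source separation: X ~ A @ S with sparse sources S.
    Alternates least-squares updates of the mixing matrix A and the sources S,
    with soft thresholding enforcing sparsity of S."""
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(X.shape[0], n_sources))
    A /= np.linalg.norm(A, axis=0)
    for _ in range(n_iter):
        S = soft_threshold(np.linalg.lstsq(A, X, rcond=None)[0], lam)
        A = np.linalg.lstsq(S.T, X.T, rcond=None)[0].T
        norms = np.linalg.norm(A, axis=0)
        A /= np.where(norms > 0.0, norms, 1.0)   # keep mixing columns normalised
    return A, S
```

Sparsity acts here as the separation criterion: components that are sparse in the chosen dictionary (morphological diversity) can be disentangled even when their frequency spectra are only partially known, which is what makes the approach robust towards the Galactic center.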


Reference 1: J. Bobin, F. Sureau, P. Paykari, A. Rassat, S. Basak and J.-L. Starck, "WMAP 9-year CMB estimation using sparsity", Astronomy and Astrophysics, 553, L4, 2013.
Reference 2: J. Bobin, F. Sureau and J.-L. Starck, "CMB reconstruction from the WMAP and Planck PR2 data", Astronomy and Astrophysics, 591, A50, 2016.

Press release: http://jstarck.free.fr/Defis_CEA_CMB_March2015.pdf

Web: http://www.cosmostat.org/research-topics/cmb