EU-funded researchers have helped generate the most accurate map to date of dark matter, the mysterious substance that makes up around 80% of the matter in the universe. The innovative big-data technologies they used will have a significant impact on fields as diverse as astrophysics and biomedical imaging.
Galaxy cluster counts in bins of mass and redshift have been shown to be a competitive probe for testing cosmological models. This method requires efficient blind detection of clusters from surveys, with a well-known selection function and robust mass estimates. The Euclid wide survey will cover 15 000 deg² of the sky in the optical and near-infrared bands, down to magnitude 24 in the H-band. The resulting data will make it possible to detect a large number of galaxy clusters spanning a wide range of masses up to redshift z ∼ 2. This paper presents the final results of the Euclid Cluster Finder Challenge (CFC). The objective of these challenges was to select the cluster detection algorithms that best meet the requirements of the Euclid mission. The final CFC included six independent detection algorithms based on different techniques, including photometric-redshift tomography, optimal filtering, a hierarchical approach, wavelets, and friends-of-friends algorithms. These algorithms were blindly applied to a mock galaxy catalog with representative Euclid-like properties. The relative performance of the algorithms was assessed by matching the resulting detections to known clusters in the simulations. Several matching procedures were tested, making it possible to constrain the associated systematic effects on completeness to <3%. All the tested algorithms are very competitive in terms of performance, with three of them reaching >80% completeness for a mean purity of 80% down to masses of 10¹⁴ M⊙ and up to redshift z = 2. Based on these results, two algorithms were selected for implementation in the Euclid pipeline: the AMICO code, based on matched filtering, and the PZWav code, based on an adaptive wavelet approach.
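To illustrate the matching step described above, here is a minimal sketch (not the Euclid pipeline code) of how detections can be matched one-to-one to true clusters in a mock catalog and turned into completeness and purity figures. The catalog format (`(ra, dec, z)` tuples) and the tolerance values are hypothetical.

```python
# Hypothetical sketch: greedy one-to-one matching of detected clusters to
# true clusters from a mock catalog, then completeness/purity estimation.
# Tolerances (max_sep in degrees, max_dz in redshift) are illustrative only.

def match_catalogs(true_clusters, detections, max_sep=0.05, max_dz=0.1):
    """Match each true cluster to its nearest unused detection on the sky,
    within an angular separation and redshift tolerance."""
    matched, used = [], set()
    for i, (tra, tdec, tz) in enumerate(true_clusters):
        best, best_d2 = None, None
        for j, (ra, dec, z) in enumerate(detections):
            if j in used or abs(z - tz) > max_dz:
                continue
            d2 = (ra - tra) ** 2 + (dec - tdec) ** 2  # flat-sky approximation
            if d2 <= max_sep ** 2 and (best is None or d2 < best_d2):
                best, best_d2 = j, d2
        if best is not None:
            used.add(best)
            matched.append((i, best))
    return matched

def completeness_purity(true_clusters, detections, **kw):
    """Completeness = matched true / all true; purity = matched det / all det."""
    m = match_catalogs(true_clusters, detections, **kw)
    return len(m) / len(true_clusters), len(m) / len(detections)
```

Because the paper reports that the choice of matching procedure shifts completeness by up to 3%, in practice one would rerun `completeness_purity` with several tolerance settings and quote the spread as a systematic uncertainty.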
Big Data could hold the key to some of the most complex phenomena described in science – provided that we can make sense of its dizzying quantities of information. The DEDALE project developed algorithms enabling just that and used them to measure the amount of dark matter in the universe.
Current trends in scientific imaging are challenged by the emerging need to integrate sophisticated machine learning with Big Data analytics platforms. This work proposes an in-memory distributed learning architecture that enables sophisticated learning and optimization techniques for scientific imaging problems, which are characterized by the combination of varied information from different origins. We apply the resulting Spark-compliant architecture to two emerging use cases from the scientific imaging domain: (a) the space-variant deconvolution of galaxy imaging surveys (astrophysics), and (b) super-resolution based on coupled dictionary training (remote sensing). We conduct evaluation studies on relevant datasets, and the results show at least a 60% improvement in response time over conventional computing solutions. Finally, the discussion provides useful practical insights into the impact of key Spark tuning parameters on the achieved speedup and on the memory/disk footprint.
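As a rough intuition for the deconvolution use case, the sketch below implements Richardson-Lucy deconvolution in 1D with pure Python, one standard building block for deblurring survey images. This is not the project's architecture: the actual system is space-variant (different PSFs across the field) and distributes image tiles across Spark workers, whereas this toy version is single-process, 1D, and uses a single PSF.

```python
# Toy sketch of Richardson-Lucy deconvolution (1D, single PSF).
# The distributed version would apply this per image tile, each tile
# carrying its own local PSF, with tiles partitioned across workers.

def convolve_same(signal, kernel):
    """'Same'-size linear convolution with zero padding at the borders."""
    n, k = len(signal), len(kernel)
    half = k // 2
    out = []
    for i in range(n):
        s = 0.0
        for j in range(k):
            idx = i + j - half
            if 0 <= idx < n:
                s += signal[idx] * kernel[j]
        out.append(s)
    return out

def richardson_lucy(data, psf, n_iter=100, eps=1e-12):
    """Iteratively sharpen `data`, assuming it was blurred by `psf`."""
    psf_mirror = psf[::-1]
    estimate = [1.0] * len(data)  # flat, positive starting point
    for _ in range(n_iter):
        blurred = convolve_same(estimate, psf)
        ratio = [d / (b + eps) for d, b in zip(data, blurred)]
        correction = convolve_same(ratio, psf_mirror)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate
```

Deconvolving a point source blurred by a symmetric three-tap PSF concentrates the flux back toward the source position, which is the behavior the survey-scale pipeline relies on for recovering galaxy shapes.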
Weak lensing 2D & 3D density fluctuation map reconstruction
3D tomographic weak lensing is one of the most important tools of modern cosmology. Exploiting the link between weak lensing and compressed sensing theory, we have proposed a new approach to reconstruct the dark matter distribution in two and three dimensions using photometric redshift information. We have shown that we can estimate the mass and redshift of dark matter haloes with very good accuracy, which is crucial for unveiling the nature of the Dark Universe (Leonard et al. 2014), and that the method significantly outperforms all existing approaches. In particular, simulations show that we can reconstruct two clusters on the same line of sight, which was impossible with previous methods. The method was chosen by the DES consortium to generate its weak-lensing mass map (Jeffrey et al. 2018).
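The compressed-sensing idea behind this kind of reconstruction can be illustrated with a toy example: recover a sparse signal x from underdetermined measurements y = A x by iterative soft thresholding (ISTA). This is only a sketch of the principle, not the GLIMPSE algorithm itself; the real method replaces A with the lensing operator and promotes sparsity in a wavelet dictionary rather than in the identity basis, and the matrices and parameters below are made up for illustration.

```python
# Toy ISTA (iterative soft-thresholding) sketch of sparse recovery:
# minimize 0.5 * ||y - A x||^2 + lam * ||x||_1 by gradient steps on the
# quadratic term followed by soft thresholding (the l1 proximal operator).

def soft_threshold(v, t):
    """Shrink each entry toward zero by t; entries below t become exactly 0."""
    return [max(abs(x) - t, 0.0) * (1.0 if x > 0 else -1.0) for x in v]

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def ista(A, y, lam=0.05, step=0.1, n_iter=500):
    """step must satisfy step <= 1 / ||A||^2 for convergence."""
    At = transpose(A)
    x = [0.0] * len(A[0])
    for _ in range(n_iter):
        residual = [yi - ri for yi, ri in zip(y, matvec(A, x))]
        grad_step = [xi + step * g for xi, g in zip(x, matvec(At, residual))]
        x = soft_threshold(grad_step, step * lam)
    return x
```

With two measurements of a three-component signal that has a single nonzero entry, the l1 penalty drives the two spurious components to exactly zero, which is the mechanism that lets a sparsity prior separate structures (such as two haloes on one line of sight) that least-squares inversions blur together.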
Reference 1: A. Leonard, F. Lanusse and J.-L. Starck, "GLIMPSE: Accurate 3D weak lensing reconstructions using sparsity", MNRAS, 440, 2, 2014.
Reference 2: F. Lanusse, J.-L. Starck, A. Leonard, S. Pires, "High Resolution Weak Lensing Mass-Mapping Combining Shear and Flexion", Astronomy and Astrophysics, 591, id.A2, 19 pp, 2016.
Reference 3: N. Jeffrey et al., MNRAS, 479, 2018, arXiv:1801.08945.
Press release: CEA press release
Co-organised by CosmoStat on May 20-22 2018 in Valencia, Spain.
Tutorial on “Sparsity” by Samuel Farrens.