After three years of work, a team from the Euclid collaboration, coordinated by Irfu, has unveiled a new method for jointly processing observations that specifically target dark matter or dark energy, two distinct but correlated concepts. The result: a greatly improved precision of the cosmological interpretation!
Determining the cause of cosmic acceleration is one of the great challenges of cosmology. Is it a constant (ΛCDM) or a new fluid (dark energy, DE)? Is there a new force that modifies gravity as described by Einstein (modified gravity, MG)?
CEA works on the analysis of dark energy and modified gravity within the framework of the European Space Agency's (ESA) Planck mission, which measured the cosmic microwave background (CMB), the light emitted 380,000 years after the Big Bang. In the final data release, we updated and tested different scenarios combining the Planck results with other data sets.
Clefs CEA n°69 – L'Intelligence Artificielle (Artificial Intelligence), November 2019.
In astrophysics, as in many other scientific fields, machine learning has become indispensable in recent years, for a very wide range of problems: image restoration, classification and characterisation of stars or galaxies, automatic separation of stars from galaxies in images, numerical simulation of observations or of the distribution of matter in the Universe…
Clefs CEA n°68 – Dernières Nouvelles du Cosmos (Latest News from the Cosmos), April 2019.
Coupling applied mathematics and astrophysics.
Deep learning is starting to offer promising results for image reconstruction in Magnetic Resonance Imaging (MRI). Many networks are being developed, but comparisons remain difficult because studies use different frameworks, the networks are not properly re-trained, and the datasets differ from one comparison to the next. The recent release of a public dataset of raw k-space data, fastMRI, encouraged us to write a consistent benchmark of several deep neural networks for MR image reconstruction. This paper presents the results of this benchmark, allowing the networks to be compared, and links to the open-source Keras implementations of all of them. The main finding of this benchmark is that performing more iterations between the image and the measurement spaces is more beneficial than having a deeper per-space network.
Reference: Z. Ramzi, P. Ciuciu and J.-L. Starck. “Benchmarking MRI reconstruction neural networks on large public datasets”, Applied Sciences, 10, 1816, 2020. doi:10.3390/app10051816
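To make the "iterations between the image and the measurement spaces" concrete, here is a minimal numpy sketch of the unrolled, cross-domain structure these networks share. The identity `denoise` placeholder stands in for a learned image-space sub-network; this is an illustration of the general scheme, not the paper's Keras code, and the function names are ours.

```python
import numpy as np

def data_consistency(image, kspace_meas, mask):
    """Project back onto the measurements: keep the acquired k-space
    samples, let the current estimate fill in the unmeasured ones."""
    k = np.fft.fft2(image)
    k = np.where(mask, kspace_meas, k)
    return np.fft.ifft2(k)

def unrolled_recon(kspace_meas, mask, n_iter=10, denoise=lambda x: x):
    """Alternate an image-space correction (here an identity placeholder,
    where a small CNN would sit in a real network) with a data-consistency
    step in measurement space."""
    image = np.fft.ifft2(np.where(mask, kspace_meas, 0))  # zero-filled start
    for _ in range(n_iter):
        image = denoise(image)                       # image-space sub-network
        image = data_consistency(image, kspace_meas, mask)
    return image
```

With a fully sampled mask and an identity denoiser this reduces to a plain inverse FFT; the benchmarked networks differ mainly in what replaces `denoise` and in how many such iterations are unrolled.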
The notion of self-acceleration was introduced as a convenient way to theoretically distinguish cosmological models in which the acceleration is due to modified gravity from those in which it is due to the properties of matter or fields. In this paper we review the concept of self-acceleration as given, for example, by , and highlight two problems. First, it applies only to universal couplings; second, it is too narrow, i.e. it excludes models in which the acceleration can be shown to be induced by a genuine modification of gravity, for instance coupled dark energy with a universal coupling, the Hu-Sawicki f(R) model or, in the context of inflation, the Starobinsky model. We then propose two new, more general concepts in its place: force-acceleration and field-acceleration, which are also applicable in the presence of non-universal couplings. We illustrate their concrete application with two examples from the modified-gravity classes still in agreement with current data, i.e. f(R) models and coupled dark energy.
As already noted, for example, in [35, 36], we further remark that non-universal couplings are at present among the (few) classes of models that survive gravitational-wave detection and local constraints (see  for a review of models surviving with a universal coupling). This is because, by construction, baryonic interactions are standard and satisfy Solar System constraints; furthermore, the speed of gravitational waves in these models is cT = 1, in agreement with gravitational-wave detection. It has also been noted (see for example [37–39] and the update in ) that models with a non-universal coupling between dark matter particles would also solve the tension in the measurement of the Hubble parameter , due to the β–H0 degeneracy first noted in Ref. .
Reference: L. Amendola, V. Pettorino, "Beyond self-acceleration: force- and fluid-acceleration", Physics Letters B, in press, 2020.
DeepMass: the first deep learning reconstruction of dark matter maps from weak lensing observational data (DES SV)
This is the first reconstruction of dark matter maps from weak lensing observational data using deep learning. We train a convolutional neural network (CNN) with a U-Net-based architecture on over 3.6 x 10^5 simulated data realisations with non-Gaussian shape noise and with cosmological parameters varying over a broad prior distribution. Our DeepMass method is substantially more accurate than existing mass-mapping methods. On a validation set of 8000 simulated DES SV data realisations, DeepMass improved the mean-square error (MSE) by 11 per cent compared to Wiener filtering with a fixed power spectrum. With N-body simulated MICE mock data, we show that Wiener filtering with the optimal known power spectrum still gives a worse MSE than our generalised method, which takes no input cosmological parameters; we show that the improvement is driven by the non-linear structures in the convergence. With higher galaxy density in future weak lensing data unveiling more non-linear scales, deep learning is likely to be a leading approach for mass mapping with Euclid and LSST.
Reference: N. Jeffrey, F. Lanusse, O. Lahav, J.-L. Starck, "Learning dark matter map reconstructions from DES SV weak lensing data", Monthly Notices of the Royal Astronomical Society, in press, 2019.
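For context, the Wiener-filter baseline that DeepMass is compared against can be sketched in a few lines of numpy. This is a toy, diagonal-in-Fourier-space version with an assumed signal power spectrum; DeepMass itself is a trained U-Net and is not reproduced here.

```python
import numpy as np

def wiener_filter(kappa_obs, p_signal, sigma_noise):
    """Fourier-space Wiener filter: scale each mode by S / (S + N),
    where S is the assumed signal power and N the noise power.

    kappa_obs   : (n, n) noisy convergence map
    p_signal    : (n, n) assumed signal power per Fourier mode
    sigma_noise : per-pixel noise standard deviation
    """
    n = kappa_obs.shape[0]
    # white noise: E|N(k)|^2 = sigma^2 * n^2 with numpy's unnormalised FFT
    p_noise = sigma_noise**2 * n * n
    k = np.fft.fft2(kappa_obs)
    k *= p_signal / (p_signal + p_noise)
    return np.real(np.fft.ifft2(k))
```

The filter down-weights modes where noise dominates, which is optimal for a Gaussian field with known power spectrum; the abstract's point is that a CNN trained on simulations does better precisely where the convergence field is non-Gaussian.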
The preliminary schedule can be found here:
Slides (password-protected) are on redmine.
The meeting starts on Monday 3 February at 9:30.
Please add your name to the following list if you intend to participate. To access IAP, external people are required to indicate their name in advance of the meeting, and might have to show identification at the IAP front desk. There is no conference fee.
Martin Kilbinger <email@example.com>
Sandrine Codis <firstname.lastname@example.org>
Space test of the Equivalence Principle: first results of the MICROSCOPE mission
Authors: P. Touboul, G. Metris, M. Rodrigues, Y. André, Q. Baghi, J. Bergé, D. Boulanger, S. Bremer, R. Chhun, B. Christophe, V. Cipolla, T. Damour, P. Danto, H. Dittus, P. Fayet, B. Foulon, P.-Y. Guidotti, E. Hardy, P.-A. Huynh, C. Lämmerzahl, V. Lebat, F. Liorzou, M. List, I. Panel, S. Pires, B. Pouilloux, P. Prieur, S. Reynaud, B. Rievers, A. Robert, H. Selig, L. Serron, T. Sumner, P. Viesser
Journal: Classical and Quantum Gravity
Download: ADS | arXiv | Fait Marquant
The Weak Equivalence Principle (WEP), which states that two bodies of different compositions and/or masses fall at the same rate in a gravitational field (universality of free fall), is at the very foundation of General Relativity. The MICROSCOPE mission aims to test its validity to a precision of 10^-15, two orders of magnitude better than current ground-based tests, by using two masses of different compositions (titanium and platinum alloys) on a quasi-circular trajectory around the Earth. This is realised by measuring the accelerations inferred from the forces required to maintain the two masses on exactly the same orbit. Any significant difference between the measured accelerations, occurring at a well-defined frequency, would correspond to the detection of a violation of the WEP, or to the discovery of a tiny new type of force in addition to gravity. MICROSCOPE's first results show no hint of such a difference: the Eötvös parameter is δ = [-1 ± 9 (stat) ± 9 (syst)] x 10^-15 (both 1σ uncertainties) for the titanium–platinum pair of materials. This result was obtained from a session of 120 orbital revolutions, representing 7% of the data acquired so far during the whole mission. The quadratic combination of the 1σ uncertainties leads to a current limit on δ of about 1.3 x 10^-14.
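The quoted limit is simply the quadratic combination (root sum of squares) of the statistical and systematic 1σ uncertainties, which can be checked in two lines of Python:

```python
import math

# MICROSCOPE first result (Ti/Pt pair): delta = [-1 ± 9 (stat) ± 9 (syst)] x 10^-15
stat = 9e-15
syst = 9e-15

# quadratic combination of the two 1-sigma uncertainties: sqrt(stat^2 + syst^2)
total = math.hypot(stat, syst)
print(f"{total:.2g}")  # 1.3e-14, the quoted current limit
```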
EU-funded researchers have helped generate the most accurate map to date of dark matter, the mysterious substance that makes up some 80% of the matter in the universe. The innovative big-data technologies they used will have a significant impact on fields as diverse as astrophysics and biomedical imaging.