Cosmostat Day on Machine Learning in Astrophysics


Date: March 5th, 2021

Organizer:  Joana Frontera-Pons  <joana.frontera-pons@cea.fr>

Venue: Remote conference. Zoom link: https://esade.zoom.us/j/88535176160?pwd=RzU1cHA5Z0xrWXkyN0x1a2tJSHZ1Zz09


On March 5th, 2021, we are organizing the 6th day on machine learning in astrophysics at DAp, CEA Saclay.

Program:

All talks are taking place remotely

13:30 - 13:40h. Welcome message                                                   
13:40 - 14:20h. Data-driven detection of multi-messenger transients - Iftach Sadeh (Deutsches Elektronen-Synchrotron)
14:20 - 15:00h. Deep Learning in Radio Astronomy - Vesna Lukic (Vrije Universiteit Brussel)
15:00 - 15:40h. Machine Learning for Galaxy Image Reconstruction with Problem Specific Loss - Fadi Nammour (CosmoStat - CEA Saclay)

15:40 - 16:00h. Coffee break with virtual croissants

16:00 - 16:40h. Anomaly detection with generative methods - Coloma Ballester (Universitat Pompeu Fabra)
16:40 - 17:20h. Deep learning for environmental sciences - Jan Dirk Wegner (ETH Zurich)
17:20 - 18:00h. Graph Neural Networks - Fernando Gama (University of California, Berkeley)

18:00 - 18:05h. End of the day


Data-driven detection of multi-messenger transients

Iftach Sadeh (Deutsches Elektronen-Synchrotron)

The primary challenge in the study of explosive astrophysical transients is their detection and characterisation using multiple messengers. For this purpose, we have developed a new data-driven discovery framework, based on deep learning. We demonstrate its use for searches involving neutrinos, optical supernovae, and gamma rays. We show that we can match or substantially improve upon the performance of state-of-the-art techniques, while significantly minimising the dependence on modelling and on instrument characterisation. In particular, our approach is intended for near- and real-time analyses, which are essential for effective follow-up of detections. Our algorithm is designed to combine a range of instruments and types of input data, representing different messengers, physical regimes, and temporal scales. The methodology is optimised for agnostic searches of unexpected phenomena, and has the potential to substantially enhance their discovery prospects.
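
As a purely illustrative sketch (not the architecture presented in the talk), the PyTorch snippet below shows how several messengers might feed a single deep-learning detector: each hypothetical input branch encodes one data type, and the encodings are fused into a single candidate score. All input formats, layer sizes and names are assumptions.

import torch
import torch.nn as nn

class MultiMessengerNet(nn.Module):
    """Hypothetical multi-branch detector: one encoder per messenger."""
    def __init__(self, n_neutrino_feat=8, n_gamma_feat=16):
        super().__init__()
        # Branch for tabular neutrino-event features.
        self.nu_branch = nn.Sequential(nn.Linear(n_neutrino_feat, 32), nn.ReLU())
        # Branch for an optical light curve treated as a 1-D sequence.
        self.lc_branch = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten())
        # Branch for summary gamma-ray features.
        self.gamma_branch = nn.Sequential(nn.Linear(n_gamma_feat, 32), nn.ReLU())
        # Fusion head producing a single transient-candidate score.
        self.head = nn.Sequential(nn.Linear(32 + 16 + 32, 64), nn.ReLU(),
                                  nn.Linear(64, 1), nn.Sigmoid())

    def forward(self, nu, lc, gamma):
        z = torch.cat([self.nu_branch(nu),
                       self.lc_branch(lc),
                       self.gamma_branch(gamma)], dim=1)
        return self.head(z)

# Example: a batch of four hypothetical events.
model = MultiMessengerNet()
score = model(torch.randn(4, 8), torch.randn(4, 1, 64), torch.randn(4, 16))
print(score.shape)  # torch.Size([4, 1])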


Deep Learning in Radio Astronomy

Vesna Lukic (Vrije Universiteit Brussel)

Machine learning techniques have proven to be increasingly useful in astronomical applications over the last few years, for example in image classification and time series analysis. A topic of current interest is the classification of radio galaxy morphologies, as it gives us insight into the nature of Active Galactic Nuclei and structure formation. Future surveys, such as the Square Kilometre Array (SKA), will detect many millions of sources and will require the use of automated techniques. Convolutional neural networks are a machine learning technique that has been very successful in image classification, due to their ability to capture high-dimensional features in the data. We show the performance of simple convolutional network architectures in classifying radio sources from the Radio Galaxy Zoo. The use of pooling in such networks results in information losses that adversely affect classification performance; Capsule networks, however, preserve this information through dynamic routing. We compare a couple of convolutional neural network architectures against variations of Capsule network setups and evaluate their performance in replicating the classifications of radio galaxies detected by the Low Frequency Array (LOFAR). Finally, we also show how convolutional neural networks can be used to find sources in radio surveys.
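
For concreteness, here is a minimal PyTorch sketch of a simple convolutional classifier of the kind described above, applied to radio-source cutouts. The input size, number of morphology classes and layer sizes are assumptions, and the max-pooling layers are exactly where the information loss mentioned above occurs.

import torch
import torch.nn as nn

n_classes = 4  # hypothetical number of morphology classes
cnn = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),   # pooling shrinks the feature map but discards spatial detail
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 128), nn.ReLU(),
    nn.Linear(128, n_classes),
)

# Example forward pass on a batch of 64x64-pixel cutouts.
x = torch.randn(8, 1, 64, 64)
logits = cnn(x)
print(logits.shape)  # torch.Size([8, 4])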


Machine Learning for Galaxy Image Reconstruction with Problem Specific Loss

Fadi Nammour (CosmoStat - CEA Saclay)

Telescope images are corrupted with blur and noise. Generally, blur is represented by a convolution with a Point Spread Function and noise is modelled as additive Gaussian noise. Restoring galaxy images from the observations is an inverse problem that is ill-posed and, specifically, ill-conditioned. The majority of standard reconstruction methods minimise the Mean Square Error to reconstruct images, without any guarantee that the shape of objects contained in the data (e.g. galaxies) is preserved. Here we introduce a shape constraint, exhibit its properties and show how it preserves galaxy shapes when combined with machine learning reconstruction algorithms.
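
The numpy/scipy sketch below illustrates this setup under stated assumptions: the forward model (convolution with the PSF plus additive Gaussian noise) and a composite training loss that adds a shape term to the usual MSE. The particular shape penalty used here, a mismatch of second-order image moments (which determine galaxy ellipticity), is an illustrative choice and not necessarily the exact constraint introduced in the talk.

import numpy as np
from scipy.signal import fftconvolve

def observe(galaxy, psf, sigma=0.01, seed=0):
    """Forward model: blur by the PSF, then add white Gaussian noise."""
    rng = np.random.default_rng(seed)
    return fftconvolve(galaxy, psf, mode="same") + rng.normal(0.0, sigma, galaxy.shape)

def second_moments(img):
    """Unweighted second-order moments, from which ellipticity is derived."""
    y, x = np.indices(img.shape)
    f = img.sum()
    xc, yc = (img * x).sum() / f, (img * y).sum() / f
    qxx = (img * (x - xc) ** 2).sum() / f
    qyy = (img * (y - yc) ** 2).sum() / f
    qxy = (img * (x - xc) * (y - yc)).sum() / f
    return np.array([qxx, qyy, qxy])

def shape_constrained_loss(reconstruction, target, gamma=1.0):
    """Mean square error plus a penalty on the moment (shape) mismatch."""
    mse = np.mean((reconstruction - target) ** 2)
    shape = np.sum((second_moments(reconstruction) - second_moments(target)) ** 2)
    return mse + gamma * shape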


Anomaly detection with generative methods

Coloma Ballester (Universitat Pompeu Fabra)

Anomaly detection is frequently approached as out-of-distribution or outlier detection. In this talk, a method for out-of-distribution detection will be discussed. It leverages the learning of the probability distribution of normal data through generative adversarial networks, while simultaneously keeping track of the states of the learning process, in order to estimate an efficient anomaly detector.
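
As a hedged illustration of this general idea (a GAN trained on normal data only, with test samples scored against what the generator can reproduce), the sketch below follows an AnoGAN-style recipe; it is not necessarily the method of the talk, and the D.features accessor and all hyperparameters are assumptions.

import torch

def anomaly_score(x, G, D, latent_dim=64, n_steps=200, lr=1e-2, lam=0.1):
    """Score a test batch x by searching the latent space of a trained GAN.

    Score = reconstruction residual |G(z) - x| plus a mismatch of
    discriminator features (D.features is an assumed accessor).
    """
    z = torch.randn(x.shape[0], latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(n_steps):
        opt.zero_grad()
        x_hat = G(z)
        residual = (x_hat - x).abs().mean()
        feat_mismatch = (D.features(x_hat) - D.features(x)).abs().mean()
        loss = residual + lam * feat_mismatch
        loss.backward()
        opt.step()
    # Large scores flag samples the generator cannot reproduce well, i.e.
    # candidates lying outside the learned "normal" distribution.
    return loss.item()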


Deep learning for environmental sciences

 Jan Dirk Wegner (ETH Zurich) 

A multitude of different sensors is capturing massive amounts of geo-coded data with different spatial resolution, temporal frequency, viewpoint, and quality every day. Modelling functional relationships for applications is often hard and loses predictive power due to the high variance in sensor modality. Data-driven approaches, especially modern deep learning, come to the rescue and learn expressive models directly from (labeled) input data. In this talk, I will present deep learning methods to analyze geospatial data at large scale for two specific applications in the environmental sciences: biodiversity estimation and global vegetation height mapping.


Graph Neural Networks

Fernando Gama (University of California, Berkeley)

Graphs are generic models of signal structure that can help to learn in several practical problems. To learn from graph data, we need scalable architectures that can be trained on moderate dataset sizes and that can be implemented in a distributed fashion. In this talk, I will draw from graph signal processing to define graph convolutions, and use them to introduce graph neural networks (GNNs). I will prove that GNNs are permutation equivariant and stable to perturbations of the graph, properties that explain their scalability and transferability. I will also use these results to explain the advantages of GNNs over linear graph filters. I will then discuss the problem of learning decentralized controllers, and how GNNs naturally leverage the partial information structure inherent to distributed systems. Using flocking as an illustrative example, I will show that GNNs not only successfully learn distributed actions that coordinate the team, but also transfer and scale to larger teams.
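
The graph convolution referred to above can be written as a polynomial graph filter, y = sum_k S^k x H_k, where S is a graph shift operator (e.g. the adjacency matrix or Laplacian). The numpy sketch below implements one such layer under that definition; the array shapes and the ReLU nonlinearity are illustrative choices.

import numpy as np

def graph_conv(S, X, H):
    """One graph convolutional layer: sum_k S^k X H_k, then a ReLU.

    S : (n, n) graph shift operator
    X : (n, f_in) graph signal, one f_in-dimensional feature per node
    H : (K, f_in, f_out) filter taps, one matrix per power of S
    """
    K = H.shape[0]
    Y = np.zeros((X.shape[0], H.shape[2]))
    Sk_X = X.copy()                 # S^0 X
    for k in range(K):
        Y += Sk_X @ H[k]            # accumulate S^k X H_k
        Sk_X = S @ Sk_X             # advance to S^(k+1) X
    return np.maximum(Y, 0.0)       # pointwise nonlinearity

# Example: a random 5-node graph, 3 input features, 2 output features, K = 4 taps.
rng = np.random.default_rng(0)
S = rng.random((5, 5)); S = (S + S.T) / 2   # symmetric shift operator
X = rng.random((5, 3))
H = rng.random((4, 3, 2))
print(graph_conv(S, X, H).shape)            # (5, 2)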


Previous Cosmostat Days on Machine Learning in Astrophysics: