WORKSHOP ON COMPUTATIONAL INTELLIGENCE IN REMOTE SENSING AND ASTROPHYSICS

The workshop on Computational Intelligence in Remote Sensing and Astrophysics (CIRSA) aims at bringing together researchers from the environmental sciences, astrophysics and computer science communities in an effort to understand the potential and pitfalls of novel computational intelligence paradigms including machine learning and large-scale data processing.

EuroPython 2019

Date: July 8-14 2019

Venue: Basel, CH

Website: https://ep2019.europython.eu/

Blog: http://blog.europython.eu/

Twitter: @europython

Conference App will be announced on the blog.

EuroPython is an annual conference hosting ~1200 participants from academia and industry who are interested in the development and applications of the Python programming language. It is also a good opportunity for students and postdocs who wish to find a job outside academia.

For more info, contact: Valeria Pettorino

A Distributed Learning Architecture for Scientific Imaging Problems

Authors: A. Panousopoulou, S. Farrens, K. Fotiadou, A. Woiselle, G. Tsagkatakis, J-L. Starck,  P. Tsakalides
Journal: arXiv
Year: 2018
Download: ADS | arXiv

Abstract

Current trends in scientific imaging are challenged by the emerging need to integrate sophisticated machine learning with Big Data analytics platforms. This work proposes an in-memory distributed learning architecture for enabling sophisticated learning and optimization techniques on scientific imaging problems, which are characterized by the combination of heterogeneous information from different origins. We apply the resulting, Spark-compliant, architecture to two emerging use cases from the scientific imaging domain, namely: (a) the space-variant deconvolution of galaxy imaging surveys (astrophysics), and (b) super-resolution based on coupled dictionary training (remote sensing). We conduct evaluation studies on relevant datasets, and the results show at least 60% improvement in time response over conventional computing solutions. Ultimately, the offered discussion provides useful practical insights on the impact of key Spark tuning parameters on the speedup achieved and on the memory/disk footprint.
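The core distributed pattern the abstract describes is partitioning an image into independent tiles and applying a processing kernel to each partition in parallel. As an illustrative stand-in only (using Python's standard library rather than Spark, and a toy normalization kernel in place of the paper's deconvolution / dictionary-learning steps), the pattern might be sketched as:

```python
from concurrent.futures import ThreadPoolExecutor

def split_tiles(image, tile_rows):
    """Partition a 2D image (list of rows) into horizontal tiles."""
    return [image[i:i + tile_rows] for i in range(0, len(image), tile_rows)]

def process_tile(tile):
    """Stand-in per-partition kernel: rescale pixels to [0, 1].
    (The actual architecture applies learning/optimization steps here.)"""
    flat = [p for row in tile for p in row]
    lo, hi = min(flat), max(flat)
    scale = (hi - lo) or 1.0
    return [[(p - lo) / scale for p in row] for row in tile]

# Toy 4x4 "survey image" split into two 2-row partitions.
image = [[0, 1, 2, 3],
         [4, 5, 6, 7],
         [8, 9, 10, 11],
         [12, 13, 14, 15]]

# Each tile is processed independently, so the work parallelizes cleanly.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(process_tile, split_tiles(image, 2)))

# Reassemble the processed tiles into one image.
processed = [row for tile in results for row in tile]
```

In Spark the same map-over-partitions step would be expressed on an RDD, with the added benefit of in-memory caching between iterations, which is where the reported speedups come from.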

DEDALE: Mathematical Tools to Help Navigate the Big Data Maze

Managing the huge volumes and varying streams of Big Data digital information presents formidable analytical challenges to anyone wanting to make sense of it. Consider the mapping of space, where scientists collect, process and transmit giga-scale data sets to generate accurate visual representations of millions of galaxies. Or consider the vast information being generated by genomics and bioinformatics as genomes are mapped and new drugs discovered. And soon the Internet of Things will bring millions of interconnected information-sensing and transmitting devices.

Improving Weak Lensing Mass Map Reconstructions using Gaussian and Sparsity Priors: Application to DES SV

Authors: N. Jeffrey, F. B. Abdalla, O. Lahav, F. Lanusse, J.-L. Starck, et al.
Year: 2018
Download: ADS | arXiv

Abstract

Mapping the underlying density field, including non-visible dark matter, using weak gravitational lensing measurements is now a standard tool in cosmology. Due to its importance to the science results of current and upcoming surveys, the quality of the convergence reconstruction methods should be well understood. We compare three different mass map reconstruction methods: Kaiser-Squires (KS), Wiener filter, and GLIMPSE. KS is a direct inversion method, taking no account of survey masks or noise. The Wiener filter is well motivated for Gaussian density fields in a Bayesian framework. The GLIMPSE method uses sparsity, with the aim of reconstructing non-linearities in the density field. We compare these methods with a series of tests on the public Dark Energy Survey (DES) Science Verification (SV) data and on realistic DES simulations. The Wiener filter and GLIMPSE methods offer substantial improvement over the standard smoothed KS across a range of metrics. For both the Wiener filter and GLIMPSE convergence reconstructions we find a 12% improvement in Pearson correlation with the underlying truth from simulations. To compare the mapping methods' abilities to find mass peaks, we measure the difference between peak counts from simulated ΛCDM shear catalogues and catalogues with no mass fluctuations. This is a standard data vector when inferring cosmology from peak statistics. The maximum signal-to-noise value of these peak statistic data vectors was increased by a factor of 3.5 for the Wiener filter and by a factor of 9 using GLIMPSE. With simulations we measure the reconstruction of the harmonic phases, showing that the concentration of the phase residuals is improved by 17% with GLIMPSE and by 18% with the Wiener filter. We show that the correlation between the reconstructions from data and the foreground redMaPPer clusters is increased by 18% with the Wiener filter and by 32% with GLIMPSE.
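The peak statistics the abstract relies on count local maxima in the signal-to-noise map of the reconstructed convergence. A minimal, illustrative peak counter over a 2D grid (pure Python; not the pipeline used in the paper, and with a simple 8-neighbour maximum criterion assumed here) could look like:

```python
def count_peaks(snr_map, threshold):
    """Count pixels that exceed `threshold` and are strictly greater
    than all of their (up to 8) neighbours; map edges are handled by
    simply skipping out-of-range neighbour positions."""
    rows, cols = len(snr_map), len(snr_map[0])
    peaks = 0
    for i in range(rows):
        for j in range(cols):
            v = snr_map[i][j]
            if v < threshold:
                continue
            neighbours = [
                snr_map[a][b]
                for a in range(max(i - 1, 0), min(i + 2, rows))
                for b in range(max(j - 1, 0), min(j + 2, cols))
                if (a, b) != (i, j)
            ]
            if all(v > n for n in neighbours):
                peaks += 1
    return peaks

# Toy S/N map with two isolated peaks above a threshold of 3.
snr = [[0, 1, 0, 0],
       [1, 5, 1, 0],
       [0, 1, 0, 4],
       [0, 0, 1, 0]]
print(count_peaks(snr, 3.0))  # → 2
```

Sweeping the threshold and recording the counts yields the peak-count data vector whose difference between ΛCDM and no-fluctuation catalogues the paper uses as its signal.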

Big Bang and Big Data

New international projects, such as the Euclid space telescope, are ushering in the era of Big Data for cosmologists. Our questions about dark matter and dark energy, which together account for 95% of the content of our Universe, throw up new algorithmic, computational and theoretical challenges. A fourth challenge concerns reproducible research, a fundamental concept for the verification and credibility of published results.

Astrophysics and MRI: a marriage that makes sense

The CEA's Fundamental Research Division is launching the COSMIC project, born from bringing together two data-processing teams: one at the Institut des sciences du vivant Frédéric-Joliot (NeuroSpin) and one at CEA-Irfu (CosmoStat). The data acquisition mechanisms in radio astronomy and in MRI share strong similarities: the mathematical models used in both are based on the principles of sparsity and compressed sensing, derived from harmonic analysis.
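The sparsity and compressed-sensing principles mentioned here both rest on the soft-thresholding operator, the proximal map of the ℓ1 penalty applied coefficient-wise at each step of iterative reconstruction algorithms. A minimal sketch (illustrative only; the project's actual reconstruction codes are far more involved):

```python
def soft_threshold(x, t):
    """Shrink x toward zero by t; values with |x| <= t become exactly 0.
    This is the proximal operator of the penalty t * |x|."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

# Applied coefficient-wise in a sparsifying domain (e.g. wavelets),
# it keeps large, significant coefficients and zeroes out small,
# noise-dominated ones -- the mechanism behind sparse denoising.
coeffs = [4.0, -0.3, 1.5, -2.5, 0.1]
denoised = [soft_threshold(c, 1.0) for c in coeffs]
print(denoised)  # → [3.0, 0.0, 0.5, -1.5, 0.0]
```

The same operator appears in both radio-interferometric imaging and compressed-sensing MRI reconstruction, which is precisely the methodological overlap COSMIC exploits.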