## Apr 06, 2017

**Speaker**

Julie Josse (Ecole Polytechnique)

**Title**

Missing data imputation using principal component methods

**Abstract**

Missing values are ubiquitous and can occur for plenty of reasons: machines that fail, survey participants who do not answer all questions, etc. The problem of missing values is exacerbated by the sheer amount of available data: data are often multi-source (several projects aim to build large repositories by compiling data from pre-existing databases), and because of the wide heterogeneity of measurement methods and research objectives, these large databases often exhibit an extraordinarily high number of missing values. Missing values are problematic because most statistical methods cannot be applied directly to incomplete data.

In this talk, I will present recent tools developed to handle missing values in a practicable way. Among them are the treatment of heterogeneous data (both quantitative and categorical) and the possibility to go far beyond single estimates with subtle ways of assessing uncertainty. I will discuss imputation methods based on (regularized) singular value decomposition, which have caught the attention of the community thanks to their ability to handle large matrices with large numbers of missing entries. I will then show how to extend these methods to multiple imputation, which provides confidence intervals and indicates how much credit should be given to analyses obtained from an incomplete data set. Such multiple imputation methods also offer new ways to visualize the variability of the results due to missing values.
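
The core of SVD-based imputation can be sketched in a few lines: alternately fit a low-rank (soft-thresholded) SVD to the completed matrix and refill the missing cells with the fit. This is a minimal illustration assuming NaN-coded missing values; the function name and defaults are ours, not the speaker's actual software (the reference implementations live in the R package missMDA).

```python
import numpy as np

def svd_impute(X, rank=2, lam=0.0, n_iter=100):
    """Impute missing entries (NaN) of X by iterative (regularized) SVD.

    Illustrative sketch: soft-threshold the singular values by lam and
    keep only `rank` components, then refill the missing cells."""
    mask = np.isnan(X)
    Xf = np.where(mask, np.nanmean(X, axis=0), X)  # start from column means
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Xf, full_matrices=False)
        s = np.maximum(s - lam, 0.0)               # regularization step
        low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        Xf = np.where(mask, low_rank, X)           # refill only missing cells
    return Xf
```

Observed entries are left untouched; only the missing cells are replaced by the low-rank fit at each pass.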

## Mar 30, 2017

**Speaker**

Matthieu Simeoni (EPFL)

**Title**

Bluebild: a Stable, Accurate and Efficient Imager for Radio Astronomy

**Abstract**

Radio astronomy imaging has primarily been focused on a planar approximation to a portion of the observed sphere, producing images at a fixed resolution. Historically, Fourier analysis played a pivotal role, with algorithmic modifications made to fit that paradigm. It was thought that spherical calculation was computationally impractical, and that inherent numerical instability meant only a dirty image (a very rough least-squares approximation) could be made. The computational and energy demands of instruments such as the planned Square Kilometre Array (SKA) have made a new approach imperative.

Here we present an efficient algorithm called Bluebild, which reconstructs directly on the celestial sphere, producing, for the first time, a true least-squares estimate of the sky. Wide-field and flexible beamformed imaging follow naturally. It produces a continuous image description that can be stored independently of resolution and sampled up to the fundamental telescope limit. A multi-scale sky decomposition becomes an intrinsic part of the process, and the algorithm's linearity permits uncertainty assessment across the whole chain. The algorithm is fast, and far simpler and more intuitive than previous methods. We show that the sky images produced are more accurate and can be analysed in much greater depth. Results with real LOFAR data will be presented.

## Mar 27, 2017

**Speaker**

Francois Meyer (University of Colorado, Boulder/INRIA)

**Title**

Detecting Structural Changes in Dynamic Community Networks

**Abstract**

The study of time-varying (dynamic) networks (graphs) is of fundamental importance for computer network analytics. Several methods have been proposed to detect significant structural changes in a time series of graphs.

The main contribution of this work is a detailed analysis of a dynamic community graph. The model grows by adding new vertices and randomly attaching them to existing nodes. The goal of the work is to detect the time at which the graph dynamics switches from a normal evolution -- where balanced communities grow at the same rate -- to an abnormal behavior -- where communities start merging.

In order to circumvent the problem of identifying the communities, we use a metric to quantify structural changes as a function of time. The detection of anomalies then becomes a matter of testing the hypothesis that the graph is undergoing a significant structural change.
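
As a concrete illustration of the "metric on graph snapshots" idea, one common choice is the adjacency spectral distance between consecutive graphs. The sketch below is illustrative and not necessarily the metric used in the talk.

```python
import numpy as np

def spectral_distance(A1, A2, k=4):
    """l2 distance between the top-k adjacency eigenvalues of two graph
    snapshots -- a simple, community-agnostic change indicator
    (illustrative choice, not necessarily the talk's metric)."""
    e1 = np.sort(np.linalg.eigvalsh(A1))[::-1][:k]
    e2 = np.sort(np.linalg.eigvalsh(A2))[::-1][:k]
    return float(np.linalg.norm(e1 - e2))

# Two balanced 4-cliques vs. the same vertices after the communities merge
A_split = np.kron(np.eye(2), np.ones((4, 4))) - np.eye(8)
A_merged = np.ones((8, 8)) - np.eye(8)
```

A sequence of snapshots can then be monitored by thresholding `spectral_distance` between consecutive graphs: the merger event shows up as a jump in the distance.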

This is work in collaboration with Peter Wills.

## Feb 17, 2017

**Speaker**

Vivien Scottez (IAP)

**Title**

Clustering-based Redshift estimation

**Abstract**

The clustering of galaxies has emerged as a powerful way to estimate the redshift distribution of a sample. I will briefly review how we can get access to redshift information from the cross-correlation function. Then, using the MICE2 simulation, I will show how we can obtain individual redshifts for each galaxy, as well as the corresponding accuracy.
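
The essence of the method — the unknown sample's redshift distribution is proportional to its cross-correlation amplitude with narrow spectroscopic reference bins, once the reference bias is divided out — can be caricatured in a few lines. This is a schematic toy, not the MICE2 analysis: real estimators also divide out the dark-matter clustering and the unknown sample's own bias evolution.

```python
import numpy as np

def dndz_from_crosscorr(w_x, b_ref):
    """Toy clustering-redshift estimator: dN/dz of the unknown sample is
    proportional to its cross-correlation amplitude w_x(z_i) with narrow
    reference bins, corrected by the reference-sample bias b(z_i)."""
    w_x = np.asarray(w_x, dtype=float)
    dndz = np.clip(w_x, 0.0, None) / np.asarray(b_ref, dtype=float)
    return dndz / dndz.sum()  # normalize to unit integral
```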

## Feb 2, 2017

**Speaker**

Virginie Ollier (ENS Cachan/Supélec)

**Title**

Robust Calibration of Radio Interferometers in Non-Gaussian Environment [slides]

**Abstract**

The development of new phased-array systems in radio astronomy, such as the Low Frequency Array (LOFAR) and the Square Kilometre Array (SKA), formed of large numbers of small and flexible elementary antennas, has led to significant challenges. Among them, calibration is a crucial step in producing meaningful high-dynamic-range images, and it is commonly performed under the assumption of Gaussian noise.

Nevertheless, observations in radio astronomy are known to be affected by outliers with several causes, e.g., weak non-calibrator sources or man-made radio frequency interference. To account for these outlier effects, the noise can be assumed to follow a spherically invariant random distribution.

Based on this modeling, a robust calibration algorithm is presented in this talk. More precisely, the new scheme is based on an iterative relaxed concentrated maximum likelihood estimation procedure and yields closed-form expressions for the unknown parameters at a reasonable computational cost.
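
The intuition behind such robust estimators is that heavy-tailed noise can be handled by iteratively down-weighting large residuals. The sketch below shows this on a toy linear model with Student-t weights (the Student-t is one spherically invariant distribution); it is a generic illustration of robust maximum likelihood, not the talk's actual relaxed concentrated ML calibration scheme, and the function name and settings are ours.

```python
import numpy as np

def robust_irls(A, y, nu=2.0, n_iter=50):
    """Robust linear estimation under heavy-tailed (Student-t) noise via
    iteratively reweighted least squares: large residuals get small weights.
    Generic sketch, not the talk's calibration algorithm."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]     # Gaussian (LS) starting point
    for _ in range(n_iter):
        r = y - A @ x
        sigma2 = np.mean(r ** 2) + 1e-12         # crude scale estimate
        w = (nu + 1.0) / (nu + r ** 2 / sigma2)  # t-distribution weights
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ y)  # weighted normal equations
    return x
```

On data contaminated by a single strong outlier, this estimate stays close to the truth while a plain least-squares fit is pulled far off.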

## Dec 1, 2016

**Speaker**

Emille Ishida (Université Blaise Pascal)

**Title**

The Cosmostatistics Initiative (COIN) - reshaping scientific interdisciplinary collaborations [slides]

**Abstract**

The Cosmostatistics Initiative (COIN) is an international working group built under the auspices of the International Astrostatistics Association (IAA). Its goal is to propose an alternative approach to scientific collaboration while contributing to the establishment of Astrostatistics as a discipline in its own right. In this talk I will describe the motivation and logistics behind the COIN collaboration and its Residence Programs, and provide a couple of examples of how interdisciplinary collaboration can shed light on long-standing astronomical problems.

## Oct 20, 2016

**Speaker**

Arnau Pujol (CosmoStat)

**Title**

Journal Club: A study of the sensitivity of shape measurements to the input parameters of weak lensing image simulations

## May 19, 2016

**Speaker**

Carmelita Carbone (INAF)

**Title**

Clustering, lensing, and ISW/Rees-Sciama from the DEMNUni neutrino simulations

**Abstract**

I will present the first set of cosmological simulations produced within the "Dark Energy and Massive Neutrino Universe" (DEMNUni) project. These simulations are characterized by a box size L = 2 Gpc/h, 2 × 2048³ particles, a baseline ΛCDM-Planck cosmology, and four different total neutrino masses, Σm_ν = 0, 0.17, 0.3, 0.53 eV, with a degenerate mass spectrum. They represent one of the largest sets of N-body simulations with a massive neutrino component treated as an additional particle type. I will present fully non-linear effects in the presence of massive neutrinos, extracted from the DEMNUni simulations, and show how neutrino free-streaming not only alters LSS clustering and lensing, but also introduces an excess of power in the ISW/RS signals, and related cross-correlations, at intermediate scales.
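
As a rough guide to the size of the free-streaming effect, a well-known linear-theory rule of thumb gives the small-scale matter power suppression as ΔP/P ≈ -8 f_ν, with f_ν = Ω_ν/Ω_m and Ω_ν h² = Σm_ν / 93.14 eV. A back-of-the-envelope sketch with illustrative cosmological parameters (not DEMNUni outputs):

```python
def neutrino_suppression(sum_mnu_eV, omega_m=0.32, h=0.67):
    """Rule-of-thumb small-scale matter power suppression from neutrino
    free-streaming: DeltaP/P ~ -8 f_nu, with Omega_nu h^2 = sum(m_nu)/93.14 eV.
    Illustrative parameter values, not DEMNUni measurements."""
    omega_nu = sum_mnu_eV / 93.14 / h ** 2
    f_nu = omega_nu / omega_m
    return -8.0 * f_nu
```

For Σm_ν = 0.3 eV this gives a suppression of roughly 20 percent, which is why the non-linear regime probed by the simulations is so sensitive to the neutrino mass.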

## May 17, 2016

**Speaker**

Ming Jiang (CosmoStat)

**Title**

Joint Multichannel Deconvolution and Blind Source Separation

## May 12, 2016

**Speaker**

Julian Adamek (Paris Observatory)

**Title**

General Relativity and Cosmic Structure Formation

**Abstract**

The Newtonian approximation usually invoked in N-body simulations of cosmic large-scale structure relies on the assumptions that gravitational fields are weak and that they are sourced only by non-relativistic matter. The latter constitutes an implicit assumption about the nature of the "dark" components of the Universe (dark matter and dark energy), thereby precluding a serious assessment of some potentially interesting models. I will present the first N-body simulations of cosmic structure formation based on a weak-field approximation to General Relativity, taking into account all six degrees of freedom of the metric. The geodesic equations are solved consistently also for relativistic particles, such as massive neutrinos. The approach is very general and can be applied to various settings where the Newtonian approximation would be unreliable or inconsistent, for instance some models of dynamical dark energy or modified gravity.

## Apr 28, 2016

**Speaker**

Aurélien Benoit-Lévy (IAP)

**Title**

News from the Dark Energy Survey (DES) [slides]

**Abstract**

The Dark Energy Survey (DES) is a galaxy photometric survey designed to study the properties of Dark Energy using four main cosmological probes: galaxy clustering on large scales, weak gravitational lensing, galaxy-cluster abundance, and supernova distances. During the northern fall of 2012, the DES collaboration installed and commissioned DECam, a 570-megapixel optical and near-infrared camera with a large 3 sq. deg. field of view, set at the prime focus of the Víctor M. Blanco 4-meter telescope at CTIO, Chile. A "Science Verification" (SV) period of observations, lasting until late February 2013, followed the DECam commissioning phase and provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. The survey is now finishing its third year of observation. At the end of the five seasons, DES will have mapped an entire octant of the southern sky to unprecedented depth, measuring the positions on the sky, redshifts, and shapes of almost 300 million galaxies, together with thousands of galaxy clusters and supernovae. In this talk, I will present the current status of the project and the first scientific results of the survey, based on the Science Verification data, with a focus on the cross-correlations between the DES data and the CMB observations by Planck and SPT.

## Apr 14, 2016

**Speaker**

Samuel Farrens (CosmoStat)

**Title**

Friends-of-friends algorithm for optical cluster detection

## Feb 3, 2016

**Speaker**

Jérémy Rapin (CardioLogs)

**Title**

Deep learning with electrocardiograms

**Abstract**

CardioLogs is a start-up created to facilitate the diagnosis and management of patients with cardiovascular diseases using artificial intelligence. This is made possible by the analysis of electrocardiograms (ECGs), which are similar to multispectral signals. After a quick description of what is observable on an ECG, we will show how deep learning, and more specifically deep convolutional neural networks, helped us design algorithms that are much more robust and efficient than state-of-the-art methods.

## Jan 28, 2016

**Speaker**

Philippe Ciuciu (NeuroSpin, CEA Saclay)

**Title**

Compressed Sensing along physically plausible k-space trajectories in MRI

**Abstract**

Magnetic Resonance Imaging (MRI) is a non-invasive and non-ionizing imaging technique that provides images of body tissues, using the contrast sensitivity coming from the magnetic parameters (T1, T2, and proton density). Data are acquired in k-space, the domain of spatial Fourier frequencies. Physical constraints make movement through k-space subject to kinematic limits: magnetic field gradients and their temporal derivatives are upper bounded. Hence, the scanning time increases with the image resolution.

Decreasing scanning time is critical to improving patient comfort, reducing exam costs, limiting tissue heating and image distortion, and improving spatial or temporal resolution in anatomical or functional MRI, respectively. This can be addressed with Compressed Sensing (CS) theory, which guarantees perfect recovery of an image from under-sampled k-space data, typically under the assumption that the image is sparse in a wavelet basis.

Unfortunately, CS theory cannot be directly cast to the MRI setting. The reasons are twofold: (i) the acquisition (Fourier) and representation (wavelet) bases are coherent, and (ii) sampling schemes obtained from CS theorems consist of isolated measurements and cannot realistically be implemented by magnetic field gradients: sampling must be performed along continuous or more regular curves. However, heuristic applications of CS in MRI have provided promising results.

In this talk, I will present the theoretical tools we developed in Nicolas Chauffert's PhD thesis to apply CS to MRI and other modalities. On the one hand, we theoretically justify why variable density sampling is the right answer to the first impediment: the more information a sample contains, the more likely it is to be drawn. On the other hand, we propose sampling schemes and design sampling trajectories that fulfill the acquisition constraints while traversing k-space with the sampling density advocated by the theory.
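
The variable density idea can be illustrated with a toy 1-D Bernoulli sampling mask whose density decays away from the centre of k-space. This is a hypothetical illustration with made-up parameters; the talk's schemes follow continuous, gradient-feasible trajectories rather than isolated random samples.

```python
import numpy as np

def variable_density_mask(n, decay=2.0, frac=0.3, seed=0):
    """Draw a 1-D k-space sampling mask with density ~ 1/(1+|k|)^decay,
    so informative low frequencies are sampled most often.
    `frac` sets the pre-clipping target expected sampling fraction."""
    rng = np.random.default_rng(seed)
    k = np.abs(np.arange(n) - n // 2)          # distance from k-space centre
    p = 1.0 / (1.0 + k) ** decay
    p *= frac * n / p.sum()                    # rescale to target budget
    return rng.random(n) < np.minimum(p, 1.0)  # Bernoulli draw per sample
```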

The reconstruction results obtained in simulations with this strategy outperform existing acquisition trajectories (spiral, radial) by about 3 dB. They pave the way for an implementation in the very near future on a real 7 T scanner at NeuroSpin, notably in the context of high-resolution anatomical imaging.

## Jan 21, 2016

**Speaker**

Stéphane Plaszczynski (LAL, Orsay)

**Title**

Relieving tensions in Planck likelihoods [slides]

**Abstract**

The recently released Planck data show the intriguing feature of presenting "too much" lensing in their power spectra. This is parametrized by the so-called A_L parameter, which should be compatible with 1.

After reviewing the path(s) from building a Planck likelihood function to estimating cosmological parameters, I will show that the deviation of A_L from 1 using the Planck default likelihoods is 2.6σ, and will connect this "tension" to another one on the reionization optical depth determined from Planck likelihoods in different multipole (ℓ) ranges. Using another Planck high-ℓ likelihood ("Hillipop"), the tension on the reionization depth is reduced, but A_L is still discrepant with 1 at the 2.2σ level.

I will then show how using the very-high-ℓ part of the CMB spectra from the ACT and SPT experiments allows one to cure all of the raised issues, and will present various systematic checks before giving an update on the ΛCDM cosmological parameters.

More details in arXiv:1510.07600.

## Dec 11, 2015

**Speaker**

Zoltán Haiman (Columbia University)

**Title**

Cosmological Information from Non-Linear Weak Lensing

**Abstract**

Weak gravitational lensing by large-scale structure is a promising tool to probe cosmology. Several large astronomical surveys have been proposed or are underway to measure weak lensing distortions of up to a billion galaxies. While cosmology is traditionally extracted from measuring two-point functions of the lensing maps, these statistics capture all the information only for a Gaussian field. The lensing field in the non-linear regime is strongly non-Gaussian, and non-Gaussian statistics could therefore provide significant additional information.

I will report on results from a large suite of ray-tracing N-body simulations in different cosmologies, and discuss the expected constraints from the number counts of lensing peaks (i.e. from the number of maxima as a function of their height), and from other statistics probing the non-linear regime. These statistics can tighten cosmological constraints by a factor of two compared to using the power spectrum alone. A recent application of this approach to the CFHTLenS survey has confirmed this for the parameters (Ω_m, σ_8). I will also argue that the numerous low signal-to-noise peaks contain most of the cosmological information and are relatively insensitive to baryon effects. Finally, I will comment on the possible use of non-Gaussianities in CMB lensing for cosmology, and on some theoretical simulation challenges for large future lensing surveys.
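
The peak-count statistic itself is simple to state: find the local maxima of a (pixelized) convergence map and histogram them by height. A minimal sketch:

```python
import numpy as np

def peak_heights(kappa):
    """Heights of the local maxima of a 2-D convergence map: interior
    pixels strictly higher than all 8 neighbours. Minimal sketch of the
    peak statistic; real analyses also smooth the map first."""
    c = kappa[1:-1, 1:-1]
    neighbours = [kappa[1 + di:kappa.shape[0] - 1 + di,
                        1 + dj:kappa.shape[1] - 1 + dj]
                  for di in (-1, 0, 1) for dj in (-1, 0, 1)
                  if (di, dj) != (0, 0)]
    is_peak = np.all([c > nb for nb in neighbours], axis=0)
    return c[is_peak]
```

`np.histogram(peak_heights(kappa), bins=...)` then yields the peak counts as a function of height, the statistic discussed above.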

## Nov 12, 2015

**Speaker**

Jean-Yves Ottmann (Université Paris-Dauphine)

**Title** (French)

Well-being and ill-being at work in laboratory professions: the case of the CEA [slides]

**Abstract** (French)

Well-being and ill-being at work are topical concepts that span numerous models and theories from different disciplines and epistemologies. One may question the relevance of these models for understanding the relationship to work of intellectual professions, expert activities, or knowledge workers.

Starting from a synthesis of the existing literature on these subjects, this thesis studies the relationship to work of laboratory professions in the hard sciences. The work takes a qualitative, comprehensive, and interpretivist approach, based on an embedded multiple-case study of four laboratories of the Commissariat à l'Énergie Atomique et aux Énergies Alternatives (CEA).

This research shows that the different statuses present in the laboratories share the same sources of engagement and well-being, but have different factors of ill-being. Moreover, they all exhibit "ambiguous relationships to work", a simultaneous conjunction of well-being and ill-being, which forces us to rethink the articulation between well-being and ill-being at work.

## Oct 29, 2015

**Speaker**

Fred Ngolè (CosmoStat)

**Title**

Matrix factorization for PSF field restoration (rehearsal for the Euclid WL Autumn Meeting in London)

## Oct 15, 2015

**Speaker**

Fabien Lacasa (Institut d'Astrophysique spatiale, IAS)

**Title**

Combining cluster counts and galaxy angular power spectrum

**Abstract**

Thanks to its 5-year observation of the southern sky (2014-2019), the Dark Energy Survey will enable unprecedented studies of galaxy clustering and galaxy clusters, with constraints on cosmology and galaxy evolution. I will show ongoing work to combine these two probes to constrain vanilla cosmology, dark energy, and Halo Occupation Distribution (HOD) parameters. The halo model can be used to model the cross-covariance between cluster counts and the galaxy power spectrum (or real-space correlation function), and I will introduce a diagrammatic method to compute the different terms involved easily and to represent them simply. I will show the importance of using a non-linear model for predictions, and that the cross-covariance is particularly important at low redshifts. A Fisher analysis will demonstrate the complementarity and synergy of the two probes for cosmological and HOD parameters. If time permits, I will also show the development of a joint likelihood using the Gram-Charlier series that respects the Poissonian character of cluster counts.
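
The complementarity argument can be made concrete with a toy Fisher forecast: for (approximately) independent probes the Fisher matrices add, so each probe shrinks the errors on the parameters the other constrains poorly. This deliberately simplified sketch ignores the cross-covariance term that the talk emphasizes; including it would couple the two blocks.

```python
import numpy as np

def combined_sigma(F1, F2):
    """Marginalized 1-sigma forecasts from adding the Fisher information of
    two independent probes. Toy sketch: the real joint analysis must include
    the cluster-galaxy cross-covariance discussed above."""
    F = np.asarray(F1, dtype=float) + np.asarray(F2, dtype=float)
    return np.sqrt(np.diag(np.linalg.inv(F)))
```

For two probes with orthogonal degeneracy directions, the combined errors are smaller than either probe's individual marginalized errors.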

## Sep 29, 2015

**Speaker**

Pierre Weiss and Paul Escande (Institut de Mathématiques de Toulouse)

**Title** (French)

On the decomposition of blur operators in wavelet bases

**Abstract** (French)

In this talk, we will begin by establishing several properties of blur operators (convolutions or, more generally, regularizing integral operators) when they are expressed in wavelet bases. Results of this type seem to have been the initial motivation for Yves Meyer's introduction of wavelets. Surprisingly, they have seen extremely little use in imaging, despite the success of wavelets in that field. We will then show that these theoretical results can have a strong impact on the practical resolution of inverse deblurring problems. As an example, we will show that they allow gains of one to two orders of magnitude in deconvolution times with l1 regularization on wavelet coefficients (bases or tight frames).
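
The central claim — that a blur operator becomes highly compressible when expressed in a wavelet basis — can be checked numerically with a small orthonormal Haar transform. This is a toy sketch with an assumed Gaussian circulant blur, unrelated to the authors' actual code.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar wavelet matrix of size n (n a power of two)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0]) / np.sqrt(2.0)                # averages
    bot = np.kron(np.eye(n // 2), [1.0, -1.0]) / np.sqrt(2.0)  # details
    return np.vstack([top, bot])

# Express an assumed Gaussian circulant blur in the Haar basis: B = W A W^T.
# Most entries of B are tiny, so the operator is well approximated sparsely.
n = 64
d = np.minimum(np.arange(n), n - np.arange(n))   # circular distances
col = np.exp(-0.5 * (d / 2.0) ** 2)
col /= col.sum()
A = np.stack([np.roll(col, i) for i in range(n)], axis=1)  # circulant blur
W = haar_matrix(n)
B = W @ A @ W.T
```

Thresholding the small entries of `B` is what makes fast wavelet-domain deblurring possible.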

## Sep 17, 2015

**Speaker**

Julien Girard (CosmoStat)

**Title**

Facing current and future calibration/imaging challenges of the Square Kilometre Array (SKA) [slides]

**Abstract**

After a brief introduction to the science drivers and techniques of the next-generation radio telescope, the Square Kilometre Array (SKA), I will attempt to give a glimpse of the current imaging and calibration issues that still need to be tackled to achieve full polarization, multi-frequency coverage, and high dynamic range in short exposure times over a wide field of view. Currently, a lot of effort is being put into developing economical approaches that will lower the high computational demands of this giant facility. Hopefully, the development and use of convex optimization methods, sparsity (e.g. Garsden et al. 2015, Girard et al. 2015), and dictionary learning, combined with the "Measurement Equation" framework (Smirnov 2011; Tasse 2014) in radio interferometry, may help meet these challenges.

## Sep 10, 2015

**Speaker**

Austin Peel (CosmoStat)

**Title**

Investigating cosmological applications of the inhomogeneous Szekeres models [slides]

**Abstract**

Exact solutions of Einstein’s field equations that can describe evolving complex structures in the universe provide complementary frameworks to standard perturbation theory for analyzing cosmological and astrophysical phenomena. I plan to present an overview of my thesis work, in which I explored cosmological applications of a family of non-symmetric spacetimes called Szekeres models. The Szekeres metric, a well-studied exact solution of general relativity, is inhomogeneous and anisotropic in general but contains the smooth FLRW metric as a special limiting case. It is therefore useful in studies where structures evolve within a ΛCDM background but still retain the full nonlinearity of general relativity. My work focused on Szekeres models in two primary contexts that I will discuss: the growth rate of large-scale structures, and light propagation in a Swiss-cheese model of the universe.