Meeting in Mathematical Statistics
December 18 – 22, 2017

Scientific & Organizing Committee

Arnak Dalalyan (ENSAE ParisTech)
Richard Nickl (University of Cambridge)
Christophe Pouet (Ecole Centrale de Marseille)

Statistics is the science of data. Establishing clear and general theoretical frameworks is essential for studying the problems considered by statisticians. Moreover, in order to assess the optimality of proposed statistical procedures, general theorems must be established, and for this task statisticians need powerful mathematical tools. These tools can be applied in very different contexts.

The Meeting in Mathematical Statistics 2017 has two main goals:
– to question the theoretical frameworks usually used by statisticians to study their problems: the minimax approach, Bayesian statistics, asymptotic versus non-asymptotic results, adaptation, oracle inequalities;
– to study in depth some of the tools used to solve recent problems in statistics, and to help make the techniques used in inventive proofs available to a large audience of statisticians.

Three scientific themes will be discussed in detail:

  1. Recent developments in measure concentration
  2. Frequentist properties of Bayesian procedures for non-parametric and high-dimensional problems
  3. Lower bounds for the risk and complexity of statistical methods

During this event, we will also take the opportunity to celebrate the 60th birthdays of two very influential statisticians: Professor Oleg Lepski (I2M, Aix-Marseille Université, organizer of the Meeting in Mathematical Statistics for ten years) and Professor Alexandre Tsybakov (LPMA-Université Pierre et Marie Curie, CREST-ENSAE and École Polytechnique). Both have had a profound influence on the development of mathematical statistics over the past twenty years.


Felix Abramovich (Tel Aviv University)   Sparse logistic regression: model selection, goodness-of-fit and classification   (pdf)
Pierre Alquier (ENSAE ParisTech)   Concentration of tempered posteriors and of their variational approximations   (pdf)
Pierre C. Bellec (Rutgers University)    How to generalize bias and variance to convex regularized estimators?
Alexandre Belloni (Duke University)    Subvector Inference in Partially Identified Models with Many Moment Inequalities    (pdf)
Quentin Berthet (University of Cambridge)    Link prediction with Matrix Logistic Regression
Cristina Butucea (CREST, ENSAE)   Estimation of linear functionals in inverse problems with unknown operator
Natalia Bochkina (University of Edinburgh)   Rates of convergence in nonparametric problems with heterogeneous variance
Rui M. Castro (Eindhoven University of Technology)   Are there needles in a moving haystack? Adaptive sensing for detection of dynamically evolving signals
Alexander Goldenshluger (University of Haifa)  Density estimation from observations with multiplicative measurement errors   (pdf)
Yuri Golubev (Aix-Marseille Université)   On multi-channel signal detection   (pdf)
Marc Hoffmann (Université Paris Dauphine)   The work of Oleg Lepski: beyond a "discourse on method"   (pdf)
Ildar Ibragimov (Steklov Institute of Mathematics)   Estimation of functions depending on a parameter observed in Gaussian noise
Iain Johnstone (Stanford University)   Eigenvalues and Variance Components   (pdf)
Anatoli Juditski (Université Grenoble Alpes)  Estimate aggregation from indirect observations   (pdf)
Vladimir Koltchinskii (Georgia Institute of Technology)   Efficient Estimation of Smooth Functionals of High-Dimensional Covariance    (pdf)
Ekaterina Krymova (Universität Duisburg-Essen)   On estimation of noise variance in high-dimensional linear models   (pdf)
Guillaume Lecué (CREST-ENSAE)    Learning from MOM’s principles   (pdf)
Enno Mammen (Heidelberg University)   Statistical Inference in Sparse High-Dimensional Nonparametric Models   (pdf)
Stanislav Minsker (University of Southern California) Robust modifications of U-statistics and estimation of the covariance structure of heavy-tailed distributions (pdf)
Axel Munk (University of Göttingen)   Statistical inference for Wasserstein transport 
Richard Nickl (University of Cambridge)     Efficient nonparametric statistical inference for a non-linear inverse problem with the Schrödinger equation  (pdf)
Marianna Pensky (University of Central Florida)   Estimation and Clustering in the Dynamic Stochastic Block Model  (pdf)
Dominique Picard (Université Paris 7)   Smooth Clustering for high-dimensional data  (pdf)
Massimiliano Pontil (Istituto Italiano di Tecnologia, Genova)   Consistent Multitask Learning with Nonlinear Output Constraints   (pdf)
Maxim Raginsky (University of Illinois Urbana-Champaign)   Compositional properties of statistical decision procedures: an information-theoretic view
Alexander Rakhlin (University of Pennsylvania)   Online Prediction: Rademacher Averages via Burkholder’s Method   (pdf)
Markus Reiss (Humboldt Universität zu Berlin)    Adaptivity of early stopping for PLS / CG1    (pdf)
Philippe Rigollet (MIT)    A biased random walk through Sasha Tsybakov’s work
Angelika Rohde (Albert-Ludwigs-Universität Freiburg)    Locally adaptive confidence bands    (pdf)
Vladimir Spokoiny (WIAS & Humboldt Universität zu Berlin)   Big ball probability with applications   (pdf)
Natalia Stepanova (Carleton University)  On application of weighted Kolmogorov-Smirnov statistics to the problems of classification, signal detection, and estimation in sparse models
Sara van de Geer (ETH Zürich)   Sharp oracle inequalities for non-convex loss   (pdf)
Nicolas Verzelen (INRA Montpellier)    Estimating Mean Functionals in the Gaussian Vector Model    (pdf)
Marten Wegkamp (Cornell University)   Overlapping Variable Clustering with Statistical Guarantees   (pdf)
Harrison Zhou (Yale University)     Three Siblings: EM, Variational Inference, and Gibbs Sampling   (pdf)