MULTIYEAR PROGRAM
CONFERENCE

Meeting in Mathematical Statistics   /   Rencontres de Statistique Mathématique
Machine learning and nonparametric statistics

13 – 17 December 2021

Scientific Committee & Organizing Committee
Comité scientifique & Comité d’organisation

Cristina Butucea (Université Paris-Est Marne-la-Vallée)
Stanislav Minsker (University of Southern California)
Christophe Pouet  (École Centrale de Marseille)
Vladimir Spokoiny (Humboldt University of Berlin)

Description
Contemporary machine learning algorithms define the state of the art in diverse areas (computer vision, robotics and speech recognition, to name a few), but in many cases the theoretical justification for the success of these methods is still missing. Mathematical results, in particular on their statistical and probabilistic properties, are being actively developed, but many challenges remain. Deep learning and generative models are prominent examples of areas with a significant gap between engineering success and theoretical understanding. Filling this gap requires tools from diverse areas such as nonparametric statistics, approximation theory and empirical process theory, as well as considerations of computational efficiency.

This conference aims to establish new and fruitful collaborations between experts in nonparametric statistics and theoretical computer science. The expected outcomes of such collaborations are new developments in the theory of machine learning, including topics such as deep learning, robustness, privacy and estimation under fairness constraints.

Lectures

Peter Bartlett (UC Berkeley)   Benign overfitting and adversarial examples
Gabor Lugosi (Pompeu Fabra University, Barcelona)   Network archeology: a few results and questions

Talks

Arya Akhavan (IIT – ENSAE)   Distributed Zero-Order Optimization under Adversarial Noise
Randolf Altmeyer (University of Cambridge)   Statistical and computational guarantees for sampling from high dimensional posterior distributions
Denis Belomestny (University of Duisburg)   Rates of convergence for density estimation with generative adversarial networks
Annika Betken (University of Twente)   Combining rank statistics and subsampling for a solution to the change-point problem in time series analysis
Gilles Blanchard (Université Paris-Saclay)   Fast rates for prediction with limited expert advice
Timothy Cannings (University of Edinburgh)   Adaptive Transfer Learning
Arnak Dalalyan (CREST-ENSAE)   Statistical guarantees for generative models
Farida Enikeeva (Université de Poitiers)   Change-Point Detection in Dynamic Networks with Missing Links
Subhodh Kotekal (University of Chicago)   Minimax rates for sparse signal detection under correlation
Matthias Löffler (ETH Zürich)   AdaBoost and robust one-bit compressed sensing
Béatrice Laurent-Bonneau (INSA de Toulouse)   Aggregated tests of independence based on HSIC measures
Tengyuan Liang (University of Chicago)   Universal Prediction Band, Semi-Definite Programming and Variance Interpolation
Arshak Minasyan (CREST-ENSAE)   All-In-One Robust Estimator of the Gaussian Mean
Mohamed Ndaoud (ESSEC)   Minimax Supervised Clustering in the Anisotropic Gaussian Mixture Model: A New Take on Robust Interpolation
Vianney Perchet (ENSAE & Criteo AI Lab)   Active learning and/or online sign identification
Kolyan Ray (Imperial College London)   Bayesian inference for multi-dimensional diffusions
Markus Reiß (Humboldt University of Berlin)   Inference on the maximal rank of time-varying covariance matrices using high-frequency data
Lionel Riou-Durand (University of Warwick)   Metropolis Adjusted Underdamped Langevin Trajectories
Etienne Roquain (Sorbonne Université)   Some transition boundaries for multiple testing with unknown null distribution
Richard Samworth (University of Cambridge)   Optimal subgroup selection
George Stepaniants (Massachusetts Institute of Technology)   Learning Partial Differential Equations in Reproducing Kernel Hilbert Spaces
Botond Tibor Szabo (Bocconi University)   Optimal distributed testing under communication constraints in high-dimensional and nonparametric Gaussian white noise model

Mathias Trabs (Karlsruhe Institute of Technology)   Dispersal density estimation across scales
Nikita Zhivotovskiy (ETH Zürich)   Stability and Generalization: Some recent results


SPONSORS


PROJECT ANR-17-CE40-0003 HIDITSA