CONFERENCE

AofA: Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms

24 – 28 June 2019

AofA’19 will be the 30th International Meeting on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms.
Scientific Committee

Nicolas Broutin (Sorbonne Université)
Luc Devroye (McGill University)
Cécilia Holmgren (Uppsala University)
Hsien-Kuei Hwang (Academia Sinica, Taipei)
Malwina Luczak (Queen Mary University of London)

Organizing Committee

Frédérique Bassino (Université Paris 13)
Conrado Martinez (UPC Barcelona)
Bruno Salvy (INRIA ENS Lyon)

Description
Analysis of Algorithms (AofA) is a field at the boundary of computer science and mathematics. The goal is to obtain a precise understanding of the asymptotic, average-case characteristics of algorithms and data structures. A unifying theme is the use of probabilistic, combinatorial, and analytic methods. The discipline is often traced back to Donald E. Knuth, whose fundamental books, “The Art of Computer Programming”, established ties between areas of study that include discrete mathematics, combinatorics, probability theory, analytic number theory, asymptotic analysis, and complexity theory. Starting in 1993, notably thanks to the efforts of Philippe Flajolet, a series of international workshops and conferences entirely devoted to the Analysis of Algorithms has been established, attended by the major actors of the field. These yearly meetings now alternate between two formats: conferences with submitted articles in even-numbered years, and more informal workshops, by invitation only, in odd-numbered years, such as this 2019 meeting. The purpose of the workshop is to stimulate research in the field by providing a regular forum where researchers can exchange ideas, and by attracting prominent people from related areas.
Speakers

Alin Bostan (INRIA Saclay Île-de-France)  Computer algebra for lattice path combinatorics
Classifying lattice walks in restricted lattices is an important problem in enumerative combinatorics. Recently, computer algebra has been used to explore and to solve a number of difficult questions related to lattice walks. We give an overview of recent results on structural properties and explicit formulas for generating functions of walks in the quarter plane, with an emphasis on the algorithmic methodology.
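As an elementary illustration of the objects in question (a sketch for the reader, not material from the talk; the step set and function name are chosen for the example), the counting sequence for walks with unit steps confined to the quarter plane can be computed by dynamic programming over endpoint positions; computer algebra methods then aim to guess and prove closed forms and functional equations for such sequences.

```python
from collections import defaultdict

STEPS = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # the "simple" step set: E, W, N, S

def quarter_plane_walks(n):
    """Number of n-step walks from (0,0) staying in the quarter plane x>=0, y>=0."""
    counts = {(0, 0): 1}  # endpoint -> number of walks ending there
    for _ in range(n):
        nxt = defaultdict(int)
        for (x, y), c in counts.items():
            for dx, dy in STEPS:
                if x + dx >= 0 and y + dy >= 0:  # discard steps leaving the quadrant
                    nxt[(x + dx, y + dy)] += c
        counts = nxt
    return sum(counts.values())

print([quarter_plane_walks(n) for n in range(4)])  # → [1, 2, 6, 18]
```

This brute-force count is exponential-free but still polynomial in n; the point of the computer-algebra approach is to replace it by exact structural information about the generating function.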

Valentin Féray (University of Zurich)   Limit shapes of random pattern-avoiding permutations
Permutation classes are sets of permutations defined by forbidden subconfigurations. Much effort has been devoted over the past thirty years to enumerating such sets of permutations. More recently, there has been growing interest in describing large uniform random permutations in a given class. We will in particular focus on an approach via decomposition trees. Analytic combinatorics plays an important role in analyzing such models. This is joint work with Frédérique Bassino (Paris-Nord), Mathilde Bouvel (Zurich), Lucas Gerin (École Polytechnique), Mickaël Maazoun (ENS de Lyon) and Adeline Pierrot (Orsay).

Peter Mörters (University of Cologne)   Metastability of the contact process on evolving scale-free networks
We study the contact process in the regime of small infection rates on scale-free networks evolving by stationary dynamics. A parameter allows us to interpolate between slow (static) and fast (mean-field) network dynamics. For two paradigmatic classes of networks we investigate transitions between phases of fast and slow extinction and in the latter case we analyse the density of infected vertices in the metastable state. The talk is based on joint work with Emmanuel Jacob (ENS Lyon) and Amitai Linker (Universidad de Chile).

Cyril Nicaud (Université Paris-Est, Marne-la-Vallée)   Realistic analysis of algorithms
In the first part of this talk, we give an analysis of the sorting algorithm TimSort, which is implemented in many popular programming languages such as Python and Java. In the second part, we present some surprising experimental results on the execution time of basic algorithms run on modern processors. We explain these observations by a theoretical analysis of the algorithms that takes features of modern architectures into account.
Though quite different, both parts try to bridge the gap between textbook algorithms and their “real life” implementations: understanding why TimSort has become a very popular sorting algorithm, and how we may have to change our way of analyzing algorithms given the progress made in computer architecture.

Robin Pemantle (University of Pennsylvania)   Effective algorithms in ACSV
I will begin by reviewing the essentials of analytic combinatorics in several variables (ACSV), highlighting the steps that have up to now required human intervention: finding the dominating point, proving minimality, contour deformation. The rest of the talk will be about effective solutions to these problems. These include rigorizing the Morse theoretic underpinnings of ACSV, using computer algebra to find critical points at infinity, complex variable methods that work for symmetric functions, and some work in progress on a general topological method for smooth denominators. For some audience members, the talk will be uncomfortably topological. On the other hand, I will include a lot of pictures, so the topology can be understood without much formal background.
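For readers unfamiliar with ACSV, here is a toy computation (not from the talk) of the kind that computer algebra automates. For the bivariate generating function F(x,y) = 1/(1-x-y) of the central binomial coefficients, the smooth critical point equations for diagonal asymptotics are H = 0 together with x·H_x = y·H_y, where H is the denominator:

```python
import sympy as sp

x, y = sp.symbols('x y')
# F(x, y) = 1/H with H = 1 - x - y satisfies [x^n y^n] F = binomial(2n, n).
H = 1 - x - y
# Smooth critical point equations for the main diagonal direction (1, 1):
# H = 0 and x*H_x = y*H_y.
critical = sp.solve([H, x * sp.diff(H, x) - y * sp.diff(H, y)], [x, y], dict=True)
print(critical)  # → [{x: 1/2, y: 1/2}]
```

The dominant point (1/2, 1/2) gives the exponential growth rate (1/(x·y))^n = 4^n, consistent with binomial(2n, n) ~ 4^n / sqrt(pi·n). The talk concerns making such steps (finding the dominant point, proving minimality, deforming contours) effective in far less trivial situations.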

Dana Randall (Georgia Institute of Technology)   Sampling algorithms and phase transitions
Markov chain Monte Carlo methods have become ubiquitous across science and engineering to model dynamics and explore large combinatorial sets. Over the last 20 years there have been tremendous advances in the design and analysis of efficient sampling algorithms for this purpose. One of the striking discoveries has been the realization that many natural Markov chains undergo phase transitions, whereby they abruptly change from being efficient to inefficient as some parameter of the system is modified. Generating functions can offer an alternative approach to sampling and they play a role in showing when certain Markov chains are efficient or not. We will explore the interplay between Markov chains, generating functions, and phase transitions for a variety of combinatorial problems, including graded posets, Boltzmann sampling, and 3-colorings on Z^2.
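To fix ideas, a minimal sketch of single-site (Glauber) dynamics for one of the models mentioned, proper 3-colorings on a finite grid (an illustration in the spirit of the abstract, not code from the talk; all names are ours): repeatedly pick a random vertex and recolor it uniformly among the colors not used by its neighbors.

```python
import random

def glauber_3coloring(width, height, steps, seed=0):
    """Single-site Glauber dynamics on proper 3-colorings of a width x height grid."""
    rng = random.Random(seed)
    # Start from a valid coloring: (x + 2y) mod 3 is proper on the grid graph,
    # since horizontal neighbors differ by 1 and vertical neighbors by 2 (mod 3).
    color = {(x, y): (x + 2 * y) % 3 for x in range(width) for y in range(height)}
    cells = list(color)
    for _ in range(steps):
        x, y = rng.choice(cells)
        neighbors = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        used = {color[v] for v in neighbors if v in color}
        free = [c for c in range(3) if c not in used]
        if free:  # resample the chosen site uniformly among allowed colors
            color[(x, y)] = rng.choice(free)
    return color
```

Every state of this chain is a proper coloring by construction; the hard question, and a theme of the talk, is how many steps are needed before the chain is close to the uniform distribution, and how that mixing time can change abruptly with the parameters of the model.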

Andrea Sportiello (CNRS / Université Paris 13)   The challenge of linear-time Boltzmann sampling
Let X_N be an ensemble of combinatorial structures of size N, equipped with a measure, and consider the algorithmic problem of sampling exactly from this measure. When the ensemble has a “combinatorial specification”, the celebrated Boltzmann sampling algorithm solves this problem with a complexity which is, typically, of order N^(3/2). Here, a factor N is inherent to the problem, implied by the Shannon bound on the average number of required random bits, while the extra factor N^(1/2) is due to the fact that Boltzmann sampling produces a random object in the union of the (X_n)’s, whose size is a random variable equal to N only with probability of order N^(-1/2). How much can we improve on this complexity, and in what generality? As one may imagine, in the state of the art there is a trade-off between generality and overall complexity. We will review some recent ideas which explore various directions in this balance.
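The N^(-1/2) hit probability can be made concrete with a minimal sketch (our illustration, not the algorithm of the talk; all names are ours). At its critical parameter, the Boltzmann sampler for full binary trees reduces to a critical Galton–Watson process, whose total size equals a target N only with probability of order N^(-1/2), so naive rejection pays an extra sqrt(N) factor:

```python
import random

def sample_size(rng, cap):
    """One critical Galton-Watson run (leaf w.p. 1/2, two children w.p. 1/2).
    Returns the number of nodes, or None if it exceeds `cap` (aborted early;
    such a tree would be rejected anyway, so aborting does not bias the sampler)."""
    pending, size = 1, 0
    while pending:
        pending -= 1
        size += 1
        if size > cap:
            return None
        if rng.random() < 0.5:
            pending += 2
    return size

def boltzmann_reject(target, tol=0.1, seed=42):
    """Approximate-size sampling: retry until size lands in [(1-tol)N, (1+tol)N].
    Since P(size ~ N) is of order N^(-1/2), the expected number of attempts
    grows like sqrt(N) -- the overhead discussed in the abstract."""
    rng = random.Random(seed)
    lo, hi = int((1 - tol) * target), int((1 + tol) * target)
    attempts = 0
    while True:
        attempts += 1
        size = sample_size(rng, cap=hi)
        if size is not None and lo <= size <= hi:
            return size, attempts
```

Note that a full binary tree always has an odd number of nodes, and that most attempts die very quickly (half of them at size 1), which is why the cost per attempt is only of order sqrt(N) rather than N.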

Lenka Zdeborová (CNRS / CEA Saclay)   Optimal algorithms in inference inspired by statistical physics
Analysis of algorithms in noisy high-dimensional probabilistic problems poses many current challenges. In a subclass of these problems the corresponding challenges can be overcome with the help of a method coming from statistical mechanics. I will review some of the related recent work together with progress on rigorous justification of the corresponding results.

SPONSORS