Anti-concentration and combinatorics

In probabilistic combinatorics (and probability in general), many arguments depend heavily on concentration inequalities (e.g. Chebyshev's inequality or Chernoff bounds), which show that certain random variables are likely to lie in a small interval around their mean. In the other direction, anti-concentration inequalities give upper bounds on the probability that a random variable falls into a small interval or is equal to a particular value.
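As a toy illustration of the two directions (a sketch added here, not part of the course text), take X ~ Binomial(n, 1/2), i.e. the number of heads in n fair coin flips. Chebyshev's inequality bounds the probability that X deviates far from its mean n/2, while the simplest anti-concentration statement bounds the largest point probability, max_x Pr[X = x] = C(n, n/2)/2^n, which is asymptotically sqrt(2/(pi n)):

```python
from math import comb, sqrt, pi

n = 1000  # number of fair coin flips; X ~ Binomial(n, 1/2)
mean, var = n / 2, n / 4

# Concentration (Chebyshev): Pr[|X - mean| >= t] <= var / t^2.
t = 50
chebyshev_bound = var / t**2  # = 0.1 for these parameters

# Anti-concentration: the largest point probability is attained at the mean,
# max_x Pr[X = x] = C(n, n/2) / 2^n ~ sqrt(2 / (pi * n)).
max_point_prob = comb(n, n // 2) / 2**n

print(f"Chebyshev bound for deviation {t}: {chebyshev_bound:.3f}")
print(f"max_x Pr[X = x] = {max_point_prob:.4f}  (sqrt(2/(pi n)) = {sqrt(2 / (pi * n)):.4f})")
```

So while X lands within 50 of its mean with high probability, no single value carries more than about 2.5% of the mass: concentration and anti-concentration bounds are complementary, not contradictory.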

In this mini-course, we will learn about some of the most important anti-concentration inequalities (in particular, we will focus on the theory surrounding the so-called Littlewood-Offord problem), and we will see a variety of applications (e.g. to combinatorial random matrix theory, extremal graph theory and the theory of Boolean functions).
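To give a flavour of the Littlewood-Offord problem: given nonzero reals a_1, ..., a_n and independent uniform random signs eps_i in {-1, +1}, Erdős proved that sup_x Pr[eps_1 a_1 + ... + eps_n a_n = x] <= C(n, floor(n/2))/2^n = O(1/sqrt(n)), with equality when all the a_i are equal. The following Monte Carlo sketch (the function name and trial count are illustrative choices, not from the course) compares the equality case with generic distinct coefficients:

```python
import random
from math import comb

def max_atom(a, trials=200_000, seed=0):
    """Estimate sup_x Pr[sum_i eps_i * a_i = x] over uniform random signs eps_i."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(trials):
        s = sum(x if rng.random() < 0.5 else -x for x in a)
        counts[s] = counts.get(s, 0) + 1
    return max(counts.values()) / trials

n = 20
erdos_bound = comb(n, n // 2) / 2**n  # Erdős's sharp bound, about 0.176 for n = 20

# Equality case: all coefficients equal, so the signed sum hits 0 with
# probability exactly C(n, n/2) / 2^n.
print(max_atom([1] * n), "vs bound", erdos_bound)

# Distinct coefficients spread the sum over many values, so every atom
# is much smaller than the bound.
print(max_atom(list(range(1, n + 1))), "vs bound", erdos_bound)
```

The simulation illustrates the extremal structure behind the theorem: anti-concentration of the signed sum is worst (the largest atom is biggest) precisely when the coefficient sequence is as degenerate as possible.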