Planned seminars

Europe/Lisbon —


Constantino Tsallis, Group of Statistical Physics, CBPF and Santa Fe Institute

Together with Newtonian mechanics, Maxwell's electromagnetism, Einstein's relativity and quantum mechanics, Boltzmann-Gibbs (BG) statistical mechanics constitutes one of the pillars of contemporary theoretical physics, with countless applications in science and technology. This theory applies formidably well to a plethora of physical systems. Still, it fails in the realm of complex systems, characterized by generically strong space-time entanglement of their elements. On the basis of a nonadditive entropy (defined by an index $q$, which recovers, for $q=1$, the celebrated Boltzmann-Gibbs-von Neumann-Shannon entropy), it is possible to generalize the BG theory. We will briefly review the foundations of this generalization and its applications in natural, artificial and social systems.
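The nonadditive entropy mentioned above has a compact closed form, $S_q = k\,(1 - \sum_i p_i^q)/(q-1)$, which reduces to the BG-Shannon entropy as $q \to 1$. A minimal numerical sketch (the function name and the test distributions are illustrative, not from the talk):

```python
import numpy as np

def tsallis_entropy(p, q, k=1.0):
    """Nonadditive entropy S_q = k * (1 - sum_i p_i^q) / (q - 1).

    In the limit q -> 1 this recovers the Boltzmann-Gibbs-Shannon
    entropy S_1 = -k * sum_i p_i * ln(p_i).
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 * ln 0 = 0 convention
    if abs(q - 1.0) < 1e-12:
        return -k * np.sum(p * np.log(p))
    return k * (1.0 - np.sum(p ** q)) / (q - 1.0)

# Uniform distribution over W = 4 states.
p = np.full(4, 0.25)
print(tsallis_entropy(p, q=1.0))      # ln 4 ~ 1.386 (BG limit)
print(tsallis_entropy(p, q=2.0))      # (1 - 4 * 0.25**2) / 1 = 0.75

# Nonadditivity for independent subsystems A and B:
# S_q(A+B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B)
a = np.array([0.5, 0.5])
joint = np.outer(a, a).ravel()
q = 2.0
sa = tsallis_entropy(a, q)
print(np.isclose(tsallis_entropy(joint, q),
                 2 * sa + (1 - q) * sa * sa))   # True
```

The last check makes the "nonadditive" label concrete: for $q \neq 1$ the entropy of two independent subsystems is not the sum of their entropies.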

A Bibliography is available at

Europe/Lisbon —


Michael Arbel, Gatsby Computational Neuroscience Unit, University College London

Annealed Importance Sampling (AIS) and its Sequential Monte Carlo (SMC) extensions are state-of-the-art methods for estimating normalizing constants of probability distributions. We propose here a novel Monte Carlo algorithm, Annealed Flow Transport (AFT), that builds upon AIS and SMC and combines them with normalizing flows (NF) for improved performance. This method transports a set of particles using not only importance sampling (IS), Markov chain Monte Carlo (MCMC) and resampling steps, as in SMC, but also normalizing flows that are learned sequentially to push particles towards the successive annealed targets. We provide limit theorems for the resulting Monte Carlo estimates of the normalizing constant and expectations with respect to the target distribution. Additionally, we show that a continuous-time scaling limit of the population version of AFT is given by a Feynman-Kac measure which simplifies to the law of a controlled diffusion for expressive NF. We demonstrate experimentally the benefits and limitations of our methodology on a variety of applications.
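The AIS baseline that AFT builds on can be sketched in a few lines: particles start as exact samples from a tractable prior, accumulate importance weights across a sequence of annealed densities, and are moved by an MCMC step at each temperature. A minimal 1-D sketch (the specific prior, target, and step sizes are illustrative choices, not from the paper, and there are no learned flows here):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prior(x):                 # N(0, 1), normalized
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

SIGMA = 2.0
def log_target(x):                # unnormalized N(0, SIGMA^2); true Z = SIGMA * sqrt(2*pi)
    return -0.5 * x**2 / SIGMA**2

def log_gamma(x, beta):           # geometric annealing path between prior and target
    return (1 - beta) * log_prior(x) + beta * log_target(x)

def ais_log_Z(n_particles=2000, n_steps=50, step=1.0):
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.standard_normal(n_particles)          # exact samples from the prior
    log_w = np.zeros(n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # importance-weight increment from moving to the next annealed target
        log_w += log_gamma(x, b) - log_gamma(x, b_prev)
        # one random-walk Metropolis step leaving gamma_b invariant
        prop = x + step * rng.standard_normal(n_particles)
        accept = np.log(rng.random(n_particles)) < log_gamma(prop, b) - log_gamma(x, b)
        x = np.where(accept, prop, x)
    # log of the mean weight, computed stably (log-sum-exp)
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))

est_Z = np.exp(ais_log_Z())
true_Z = SIGMA * np.sqrt(2 * np.pi)
print(est_Z, true_Z)              # estimate close to SIGMA * sqrt(2*pi) ~ 5.01
```

AFT replaces part of this transport with normalizing flows trained at each annealing step, so that particles are pushed deterministically towards the next target before the IS, MCMC, and resampling steps are applied.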

Europe/Lisbon —


Soledad Villar, Mathematical Institute for Data Science at Johns Hopkins University

There has been enormous progress in the last few years in designing neural networks that respect the fundamental symmetries and coordinate freedoms of physical law. Some of these frameworks make use of irreducible representations, some make use of high-order tensor objects, and some apply symmetry-enforcing constraints. Different physical laws obey different combinations of fundamental symmetries, but a large fraction (possibly all) of classical physics is equivariant to translation, rotation, reflection (parity), boost (relativity), and permutations. Here we show that it is simple to parameterize universally approximating polynomial functions that are equivariant under these symmetries, or under the Euclidean, Lorentz, and Poincaré groups, in any dimension d. The key observation is that nonlinear O(d)-equivariant (and related-group-equivariant) functions can be expressed in terms of a lightweight collection of scalars: scalar products and scalar contractions of the scalar, vector, and tensor inputs. These results demonstrate theoretically that gauge-invariant deep learning models for classical physics with good scaling for large problems are feasible right now.
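The key observation can be checked numerically in a few lines: any function of the form h(v1, v2) = f(s) v1 + g(s) v2, where s collects the pairwise inner products, is automatically O(d)-equivariant, because the inner products themselves are invariant. A minimal sketch (the particular choices of f and g are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def equivariant_fn(v1, v2):
    """O(d)-equivariant vector function built only from invariant scalars:
    h(v1, v2) = f(s) * v1 + g(s) * v2, with s the pairwise inner products."""
    s11, s12, s22 = v1 @ v1, v1 @ v2, v2 @ v2
    f = np.tanh(s11 - s22)        # arbitrary scalar functions of the invariants
    g = np.sin(s12)
    return f * v1 + g * v2

d = 5
v1, v2 = rng.standard_normal(d), rng.standard_normal(d)

# Random orthogonal matrix from a QR decomposition (det may be +1 or -1,
# so this also exercises reflections, i.e. the full group O(d)).
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))

lhs = Q @ equivariant_fn(v1, v2)          # rotate the output
rhs = equivariant_fn(Q @ v1, Q @ v2)      # rotate the inputs
print(np.allclose(lhs, rhs))              # True: equivariance holds exactly
```

Since Q preserves all inner products, the scalar coefficients f and g are unchanged under the transformation, and equivariance follows by linearity; this is the mechanism behind the scalar-based parameterization described in the abstract.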