Lars Onsager Prize: Optimization and learning algorithms from the theory of disordered systems

Invited

Abstract

The extraction of information from large amounts of data is one of the prominent cross-disciplinary challenges in contemporary science. Solving inverse and learning problems over large-scale data sets requires the design of efficient optimization algorithms over very large networks of constraints. In such settings, critical phenomena of the type studied in the statistical physics of disordered systems often play a crucial role. Over the last decade, this observation has led to a cross-fertilization between statistical physics, information theory, and computer science, with applications in a variety of fields. In particular, a deeper geometric understanding of the ground-state structure of random computational problems has emerged, along with novel classes of probabilistic algorithms. In this talk I will give a brief overview of these conceptual advances and discuss the role that subdominant states play in the design of algorithms for large-scale optimization problems. I will conclude by showing how these ideas can lead to novel applications in computational neuroscience.

Authors

  • Riccardo Zecchina

    Politecnico di Torino