Waddington differentiation and saddle bifurcation in generalized Hopfield networks

ORAL

Abstract

Networks in machine learning offer examples of complex high-dimensional dynamical systems, inspired by and reminiscent of what is observed in biology. Here, we study the learning dynamics of Generalized Hopfield networks, which allow for a visualization of internal memories. Those networks are known to proceed through a 'feature-to-prototype' transition, which has not been fully characterized. We show that, in the 'prototype' learning mode, learning networks exhibit dynamics reminiscent of the well-known Waddington landscape model for cellular differentiation. In particular, the dynamics proceed through sequential 'splits' in memory space. The order of these splits is interpretable and reproducible, and the dynamics between splits are canalized in the Waddington sense. We then study smaller versions of the system exhibiting similar properties as the full system. We combine analytical calculations with numerical simulations to study the feature-to-prototype transition and the behaviour of saddles visited during learning. We exhibit regimes where saddles appear and disappear through saddle-node bifurcations, changing the distribution of learnt memories. Memories can thus differentiate in a predictable and controlled way, revealing new bridges between experimental biology, dynamical systems theory, and machine learning.
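As an illustration of the class of models discussed, the sketch below sets up a minimal generalized (dense) Hopfield network in which memories are learnt by gradient descent on an energy with a power-law interaction function F(x) = x^n. This is a hedged toy example, not the authors' exact model or training rule: the synthetic data, the function and variable names (e.g. sample_batch, xi), and all parameter values (N, K, n, eta) are illustrative assumptions. It only shows the general mechanism by which memories can 'differentiate' toward data prototypes as the exponent n is increased.

import numpy as np

# Minimal sketch of a generalized (dense) Hopfield network:
# the interaction function F(x) = x^n controls whether learnt memories
# resemble data features (small n) or whole prototypes (large n).
# Everything below (data, hyperparameters, training rule) is illustrative.

rng = np.random.default_rng(0)

N = 64      # number of visible units
K = 4       # number of memories
n = 3       # interaction exponent (larger n pushes toward prototype learning)
eta = 0.01  # learning rate

# Synthetic data: noisy copies of two ground-truth binary prototypes.
prototypes = rng.choice([-1.0, 1.0], size=(2, N))

def sample_batch(batch_size=32, flip_prob=0.1):
    idx = rng.integers(0, 2, size=batch_size)
    v = prototypes[idx].copy()
    flips = rng.random(v.shape) < flip_prob
    v[flips] *= -1
    return v

# Memories (rows of xi) are learnt by gradient descent on the energy
# E(v) = -sum_mu F(xi_mu . v), a common choice for this family of models.
xi = 0.01 * rng.standard_normal((K, N))

for step in range(2000):
    v = sample_batch()
    overlaps = v @ xi.T                               # shape (batch, K)
    # dE/dxi_mu = -n * (xi_mu . v)^(n-1) * v, averaged over the batch
    grad = -n * (overlaps ** (n - 1)).T @ v / v.shape[0]
    xi -= eta * grad
    xi /= np.maximum(np.linalg.norm(xi, axis=1, keepdims=True), 1e-8)

# Each memory's normalized overlap with the ground-truth prototypes shows
# whether it has 'differentiated' toward one of them (values near +/-1).
print(np.round(xi @ prototypes.T / np.sqrt(N), 2))

Because F'(x) = n x^(n-1) weights strongly overlapping samples more heavily for n > 1, each memory tends to break symmetry and commit to one cluster, which is the prototype-like behaviour the abstract refers to; with n = 1 the same update reduces to a purely Hebbian, feature-like regime.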

Publication: Waddington differentiation and saddle bifurcation for prototype learning

Presenters

  • Nacer Eddine Boukacem

    Université de Montréal

Authors

  • Nacer Eddine Boukacem

    Université de Montréal

  • Allen Leary

    Regeneron

  • Paul Francois

    Université de Montréal

  • Madhav Mani

    Northwestern University

  • Felix Gottlieb

    McGill University

  • Robin Thériault

    Scuola Normale Superiore di Pisa