Probing the Latent Hierarchical Structure of Data via Diffusion Models

Bibliographic Details
Main Authors: Sclocchi, Antonio; Favero, Alessandro; Levi, Noam Itzhak; Wyart, Matthieu
Format: Journal Article
Language: English
Published: 17.10.2024

Summary: High-dimensional data must be highly structured to be learnable. Although the compositional and hierarchical nature of data is often put forward to explain learnability, quantitative measurements establishing these properties are scarce. Likewise, accessing the latent variables underlying such a data structure remains a challenge. In this work, we show that forward-backward experiments in diffusion-based models, where data is noised and then denoised to generate new samples, are a promising tool to probe the latent structure of data. We predict in simple hierarchical models that, in this process, changes in data occur by correlated chunks, with a length scale that diverges at a noise level where a phase transition is known to take place. Remarkably, we confirm this prediction in both text and image datasets using state-of-the-art diffusion models. Our results show how latent variable changes manifest in the data and establish how to measure these effects in real data using diffusion models.
DOI: 10.48550/arxiv.2410.13770
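
Below is a minimal, self-contained sketch (not from the paper) of the forward-backward protocol described in the summary: noise a sample up to an intermediate level, denoise it back with the reverse diffusion chain, and measure how the resulting changes are spatially correlated. It assumes toy Gaussian data with a closed-form score and a standard DDPM schedule rather than the text and image diffusion models used in the work; the function names (`forward_backward`, `exact_noise_pred`) and all parameter values are illustrative.

```python
# Toy forward-backward experiment with a Gaussian diffusion model.
# Data are drawn from a spatially correlated Gaussian, so the exact score of the
# noised marginal is available in closed form and no denoiser training is needed.
import numpy as np

rng = np.random.default_rng(0)

# --- toy "data": 1D Gaussian field with short-range correlations -------------
d = 128                                   # sequence length
idx = np.arange(d)
Sigma = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 4.0) ** 2)  # RBF covariance
L = np.linalg.cholesky(Sigma + 1e-6 * np.eye(d))

# --- standard DDPM noise schedule ---------------------------------------------
T = 400
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
abar = np.cumprod(alphas)

def exact_noise_pred(x_t, t):
    """Exact eps-prediction for Gaussian data: eps_hat = sqrt(1 - abar_t) * C_t^{-1} x_t,
    where C_t = abar_t * Sigma + (1 - abar_t) * I is the noised marginal covariance."""
    C_t = abar[t] * Sigma + (1.0 - abar[t]) * np.eye(d)
    return np.sqrt(1.0 - abar[t]) * np.linalg.solve(C_t, x_t)

def forward_backward(x0, t_star):
    """Noise x0 up to level t_star, then denoise back to t = 0 by ancestral sampling."""
    eps = rng.standard_normal(d)
    x = np.sqrt(abar[t_star]) * x0 + np.sqrt(1.0 - abar[t_star]) * eps
    for t in range(t_star, -1, -1):
        eps_hat = exact_noise_pred(x, t)
        mean = (x - betas[t] / np.sqrt(1.0 - abar[t]) * eps_hat) / np.sqrt(alphas[t])
        noise = rng.standard_normal(d) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise
    return x

# --- run the experiment and look at how changes are correlated along the sequence
x0 = L @ rng.standard_normal(d)
x0_hat = forward_backward(x0, t_star=200)     # intermediate noise level
delta = x0_hat - x0                           # "change field" between old and new sample

# normalized autocorrelation of the change field vs. distance
delta_c = delta - delta.mean()
acf = np.correlate(delta_c, delta_c, mode="full")[d - 1:] / (delta_c @ delta_c)
corr_len = np.argmax(acf < np.exp(-1)) if np.any(acf < np.exp(-1)) else d
print(f"estimated correlation length of changes: {corr_len} sites")
```

Because the score is exact, the sketch isolates the protocol itself: sweeping `t_star` and tracking the estimated correlation length of the change field is the kind of measurement the summary describes, with the paper's prediction being that this length scale diverges at the noise level of a phase transition.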