March 24, 2026

Asymptotic Learning Curves for Diffusion Models with Random Features Score and Manifold Data

Authors

Anand Jerry George, Nicolas Macris

Abstract

We study the theoretical behavior of denoising score matching -- the learning task associated with diffusion models -- when the data distribution is supported on a low-dimensional manifold and the score is parameterized by a random feature neural network. We derive asymptotically exact expressions for the test, train, and score errors in the high-dimensional limit. Our analysis reveals that, for linear manifolds, the sample complexity required to learn the score function scales linearly with the intrinsic dimension of the manifold rather than with the ambient dimension. Perhaps surprisingly, the benefits of low-dimensional structure start to diminish once the manifold is non-linear. These results indicate that diffusion models can benefit from structured data; however, the dependence on the specific type of structure is subtle and intricate.
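To make the setup concrete, here is a minimal toy sketch of the learning task the abstract describes: data on a low-dimensional linear manifold, noised samples, and a random-feature score model fitted by denoising score matching. All dimensions, the noise level, and the ridge regularizer are illustrative assumptions, not values from the paper, and the closed-form ridge fit stands in for whatever training procedure the authors analyze.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumptions for illustration only): ambient dimension d,
# intrinsic dimension d_int, n samples, p random features.
d, d_int, n, p = 20, 3, 500, 200

# Clean data supported on a d_int-dimensional linear manifold in R^d.
U = np.linalg.qr(rng.standard_normal((d, d_int)))[0]  # orthonormal basis
x0 = rng.standard_normal((n, d_int)) @ U.T

# Noised samples at a single noise level sigma.
sigma = 0.5
eps = rng.standard_normal((n, d))
x_noisy = x0 + sigma * eps

# Random feature map: fixed random first layer, ReLU nonlinearity;
# only the linear readout W is trained.
F = rng.standard_normal((p, d)) / np.sqrt(d)
phi = np.maximum(x_noisy @ F.T, 0.0)  # (n, p) features

# Denoising score matching: regress the conditional score target
# -eps / sigma on the features (ridge-regularized least squares).
target = -eps / sigma
lam = 1e-3
W = np.linalg.solve(phi.T @ phi + lam * np.eye(p), phi.T @ target)

train_err = np.mean((phi @ W - target) ** 2)
```

The train error here is the empirical analogue of the quantities whose high-dimensional limits the paper characterizes; sweeping `n` against `d_int` versus `d` is the kind of experiment that would trace out a learning curve.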

Metadata

arXiv ID: 2603.22962
Provider: ARXIV
Primary Category: cs.LG
Categories: cs.LG, stat.ML
Comment: 12 pages
Published: 2026-03-24
Fetched: 2026-03-25 06:02
