February 23, 2026

Improving Generalization and Trainability of Quantum Eigensolvers via Graph Neural Encoding

Authors

Jungyun Lee, Daniel K. Park

Abstract

Determining the ground state of a many-body Hamiltonian is a central problem across physics, chemistry, and combinatorial optimization, yet it is often classically intractable due to the exponential growth of Hilbert space with system size. Even on fault-tolerant quantum computers, quantum algorithms with convergence guarantees -- such as quantum phase estimation and quantum subspace methods -- require an initial state with sufficiently large overlap with the true ground state to be effective. Variational quantum eigensolvers (VQEs) are natural candidates for preparing such states; however, standard VQEs typically exhibit poor generalization, requiring retraining for each Hamiltonian instance, and often suffer from barren plateaus, where gradients can vanish exponentially with circuit depth and system size. To address these limitations, we propose an end-to-end representation learning framework that combines a graph autoencoder with a classical neural network to generate VQE parameters that generalize across Hamiltonian instances. By encoding interaction topology and coupling structure, the proposed model produces high-overlap initial states without instance-specific optimization. Through extensive numerical experiments on families of one- and two-local Hamiltonians, we demonstrate improved generalization and trainability, manifested as reduced test error and a significantly milder decay of gradient variance. We further show that our method substantially accelerates convergence in quantum subspace-based eigensolvers, highlighting its practical impact for downstream quantum algorithms.
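The pipeline the abstract describes, encoding a Hamiltonian's interaction topology and coupling structure with a graph model and decoding VQE circuit parameters from that embedding, can be sketched schematically. The toy example below is an illustrative assumption throughout: the message-passing rule, all layer sizes, the random weights, and the ansatz parameter count are placeholders, not the authors' architecture. It maps the coupling graph of a small 2-local Ising Hamiltonian to a vector of ansatz rotation angles using only NumPy.

```python
import numpy as np

def encode_graph(adj, couplings, dim=8, rounds=2, seed=0):
    """Toy message-passing encoder: node features are aggregated
    along coupling-weighted edges, then mean-pooled into a single
    graph embedding. Weights are random placeholders."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    W = rng.normal(size=(dim, dim)) / np.sqrt(dim)
    h = rng.normal(size=(n, dim))        # initial node features
    A = adj * couplings                  # coupling-weighted adjacency
    for _ in range(rounds):
        h = np.tanh((A @ h) @ W)         # one message-passing round
    return h.mean(axis=0)                # mean-pool to graph embedding

def decode_params(embedding, n_qubits, layers=2, seed=1):
    """Toy linear head mapping the graph embedding to ansatz angles
    (one rotation per qubit per layer, purely illustrative)."""
    rng = np.random.default_rng(seed)
    n_params = n_qubits * layers
    M = rng.normal(size=(n_params, embedding.size)) / np.sqrt(embedding.size)
    return np.pi * np.tanh(M @ embedding)  # angles bounded in (-pi, pi)

# Example instance: 4-qubit ring of ZZ couplings, H = sum_<i,j> J_ij Z_i Z_j
n = 4
adj = np.zeros((n, n))
J = np.zeros((n, n))
for i in range(n):
    j = (i + 1) % n
    adj[i, j] = adj[j, i] = 1.0
    J[i, j] = J[j, i] = 0.5

theta = decode_params(encode_graph(adj, J), n_qubits=n)
print(theta.shape)  # one angle per qubit per layer: (8,)
```

Because the parameters are produced by a single forward pass through the encoder and decoder, a trained version of such a model could emit an initial parameter vector for a new Hamiltonian instance without any instance-specific optimization, which is the generalization property the abstract targets.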

Metadata

arXiv ID: 2602.19752
Provider: ARXIV
Primary Category: quant-ph
Published: 2026-02-23
Fetched: 2026-02-24 04:38
