AI · LLM · March 12, 2026

Deep Learning Network-Temporal Models For Traffic Prediction

Authors

Yufeng Xin, Ethan Fan

Abstract

Time series analysis is critical for emerging network intelligent control and management functions. However, existing statistics-based and shallow machine learning models have shown limited prediction capability on multivariate time series. The intricate topological interdependencies and complex temporal patterns in network data demand new modeling approaches. In this paper, building on a systematic study of multivariate time series models, we present two deep learning models that learn temporal patterns and network topological correlations simultaneously: a customized network-temporal graph attention network (GAT) model and a fine-tuned multi-modal large language model (LLM) with a clustering overture. Both models are evaluated against an LSTM baseline that already outperforms the statistical methods. Through extensive training and performance studies on a real-world network dataset, the LLM-based model demonstrates superior overall prediction and generalization performance, while the GAT model shows its strength in reducing prediction variance across time series and horizons. Further analysis also reveals important insights into correlation variability and prediction distribution discrepancies across time series and different prediction horizons.
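The abstract's core idea is a GAT layer that mixes each node's features with those of its topological neighbors via learned attention. The following is a minimal NumPy sketch of one such layer, not the authors' implementation; the function name, shapes, and LeakyReLU slope are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of a single graph-attention (GAT) layer over node
# features (e.g. per-node traffic histories). Not the paper's code.
def gat_layer(X, A, W, a, slope=0.2):
    """X: (N, F) node features, A: (N, N) adjacency with self-loops,
    W: (F, F') projection matrix, a: (2*F',) attention vector."""
    H = X @ W                          # project node features
    N = H.shape[0]
    scores = np.full((N, N), -np.inf)  # -inf masks non-edges in the softmax
    for i in range(N):
        for j in range(N):
            if A[i, j]:
                e = np.concatenate([H[i], H[j]]) @ a
                scores[i, j] = np.maximum(slope * e, e)  # LeakyReLU
    # Row-wise softmax over neighbors; self-loops keep every row non-empty.
    alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha @ H                   # attention-weighted neighbor mix
```

In a network-temporal model, such a layer would typically be stacked with a temporal module (e.g. an LSTM over each node's history) so that topological and temporal structure are learned jointly.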

Metadata

arXiv ID: 2603.11475
Provider: ARXIV
Primary Category: cs.LG
Published: 2026-03-12
Fetched: 2026-03-14 05:03
