March 10, 2026

Variational Routing: A Scalable Bayesian Framework for Calibrated Mixture-of-Experts Transformers

Authors

Albus Yizhuo Li, Matthew Wicker

Abstract

Foundation models are increasingly being deployed in contexts where understanding the uncertainty of their outputs is critical to ensuring responsible deployment. While Bayesian methods offer a principled approach to uncertainty quantification, their computational overhead renders them impractical for training or inference at foundation model scale. State-of-the-art models achieve parameter counts in the trillions through carefully engineered sparsity, including Mixture-of-Experts (MoE) layers. In this work, we demonstrate calibrated uncertainty at scale by introducing Variational Mixture-of-Experts Routing (VMoER), a structured Bayesian approach for modelling uncertainty in MoE layers. VMoER confines Bayesian inference to the expert-selection stage, which is typically performed by a deterministic routing network. We instantiate VMoER using two inference strategies: amortised variational inference over the routing logits, and inference of a temperature parameter for stochastic expert selection. Across tested foundation models, VMoER improves routing stability under noise by 38%, reduces calibration error by 94%, and increases out-of-distribution AUROC by 12%, while incurring less than 1% additional FLOPs. These results suggest VMoER offers a scalable path toward robust and uncertainty-aware foundation models.
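To make the first inference strategy concrete, the sketch below illustrates one plausible way to realise amortised variational inference over routing logits: a Gaussian variational posterior over the router's logits, sampled via the reparameterisation trick before standard top-k gating, with a KL term against a standard-normal prior. This is a minimal, hypothetical illustration of the general technique, not the authors' implementation; all class and parameter names are assumptions.

```python
# Hypothetical sketch of a variational MoE router (illustrative, not the paper's code).
# Assumes a Gaussian posterior q(logits | x) learned per token, combined with
# ordinary top-k expert selection as used in standard MoE transformers.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VariationalRouter(nn.Module):
    def __init__(self, d_model: int, num_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Amortised variational posterior: mean and log-variance heads over expert logits.
        self.mu = nn.Linear(d_model, num_experts)
        self.log_var = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor):
        # x: (batch, d_model) token representations entering the MoE layer.
        mu, log_var = self.mu(x), self.log_var(x)
        # Reparameterised sample of the routing logits.
        logits = mu + torch.randn_like(mu) * torch.exp(0.5 * log_var)
        # Standard top-k gating on the sampled logits.
        weights, experts = torch.topk(F.softmax(logits, dim=-1), self.top_k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)
        # KL(q || N(0, I)) regulariser for the variational objective.
        kl = 0.5 * (mu.pow(2) + log_var.exp() - log_var - 1).sum(dim=-1).mean()
        return weights, experts, kl


# Example: route a batch of 4 tokens across 8 experts, selecting 2 per token.
router = VariationalRouter(d_model=16, num_experts=8)
w, e, kl = router(torch.randn(4, 16))
```

Repeating the forward pass with multiple logit samples would yield a distribution over expert assignments per token, which is the kind of routing uncertainty the abstract's calibration and OOD-detection results rely on.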

Metadata

arXiv ID: 2603.09453
Provider: ARXIV
Primary Category: cs.LG
Categories: cs.LG, cs.AI, stat.ML
Comment: 8 pages, 7 figures for main text; 16 pages for Appendix; In submission to ICML 2026
PDF: https://arxiv.org/pdf/2603.09453v1
Published: 2026-03-10
Fetched: 2026-03-11 06:02

