Paper
Tiny Aya: Bridging Scale and Multilingual Depth
Authors
Alejandro R. Salamanca, Diana Abagyan, Daniel D'souza, Ammar Khairi, David Mora, Saurabh Dash, Viraat Aryabumi, Sara Rajaee, Mehrnaz Mofakhami, Ananya Sahu, Thomas Euyang, Brittawnya Prince, Madeline Smith, Hangyu Lin, Acyr Locatelli, Sara Hooker, Tom Kocmi, Aidan Gomez, Ivan Zhang, Phil Blunsom, Nick Frosst, Joelle Pineau, Beyza Ermis, Ahmet Üstün, Julia Kreutzer, Marzieh Fadaee
Abstract
Tiny Aya redefines what a small multilingual language model can achieve. Trained on 70 languages and refined through region-aware posttraining, it delivers state-of-the-art translation quality, strong multilingual understanding, and high-quality target-language generation, all with just 3.35B parameters. The release includes a pretrained foundation model, a globally balanced instruction-tuned variant, and three region-specialized models targeting languages from Africa, South Asia, Europe, Asia-Pacific, and West Asia. This report details the training strategy, data composition, and comprehensive evaluation framework behind Tiny Aya, and presents an alternative scaling path for multilingual AI: one centered on efficiency, balanced performance across languages, and practical deployment.
Metadata
arXiv: 2603.11510 (cs.CL)
Published: 2026-03-12