
LaTeX Compilation: Challenges in the Era of LLMs

Authors

Tianyou Liu, Ziqiang Li, Yansong Li, Xurui Liu

Abstract

As large language models (LLMs) increasingly assist scientific writing, the limitations and significant token cost of TeX become increasingly visible. This paper analyzes TeX's fundamental defects in compilation and user-experience design to illustrate its limitations in compilation efficiency, generated semantics, error localization, and tool ecosystem in the era of LLMs. As an alternative, Mogan STEM, a WYSIWYG structured editor, is introduced. Mogan outperforms TeX in the above aspects through its efficient data structure, fast rendering, and on-demand plugin loading. Extensive experiments verify the benefits in compilation/rendering time and in performance on LLM tasks. Moreover, we show that, owing to Mogan's lower information entropy, fine-tuning LLMs on .tmu (Mogan's document format) is more efficient than fine-tuning on TeX. We therefore call for larger-scale experiments on LLM training with the .tmu format.
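The abstract's entropy argument can be made concrete with a character-level Shannon entropy measurement. The sketch below is illustrative only: the sample strings are hypothetical stand-ins (the plain-markup string is not actual .tmu content, whose syntax is not shown in this listing), and character-level entropy is just one simple proxy for the token cost the authors discuss.

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Character-level Shannon entropy of a string, in bits per character."""
    counts = Counter(text)
    n = len(text)
    # H = -sum p(c) * log2 p(c) over the distinct characters c
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical comparison: verbose LaTeX markup vs. a sparser plain encoding
# of the same formula (NOT actual .tmu syntax).
latex_src = r"\begin{equation} e^{i\pi} + 1 = 0 \end{equation}"
plain_src = "e^(i*pi) + 1 = 0"

print(f"LaTeX: {len(latex_src)} chars, {shannon_entropy(latex_src):.2f} bits/char")
print(f"Plain: {len(plain_src)} chars, {shannon_entropy(plain_src):.2f} bits/char")
```

Under this toy measure, the markup-heavy string carries more characters for the same mathematical content; the paper's claim is the analogous statement for .tmu versus TeX at corpus scale, measured against LLM fine-tuning efficiency.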

Metadata

arXiv ID: 2603.02873
Provider: ARXIV
Primary Category: cs.CL
Published: 2026-03-03
Comment: 25 pages, 12 figures
Fetched: 2026-03-04 03:41
