AI · LLM · February 25, 2026

Trie-Aware Transformers for Generative Recommendation

Authors

Zhenxiang Xu, Jiawei Chen, Sirui Chen, Yong He, Jieyu Yang, Chuan Yuan, Ke Ding, Can Wang

Abstract

Generative recommendation (GR) aligns with advances in generative AI by casting next-item prediction as token-level generation rather than score-based ranking. Most GR methods adopt a two-stage pipeline: (i) item tokenization, which maps each item to a sequence of discrete, hierarchically organized tokens; and (ii) autoregressive generation, which predicts the next item's tokens conditioned on the tokens of the user's interaction history. Although hierarchical tokenization induces a prefix tree (trie) over items, standard autoregressive modeling with conventional Transformers often flattens item tokens into a linear stream and overlooks the underlying topology. To address this, we propose TrieRec, a trie-aware generative recommendation method that augments Transformers with structural inductive biases via two positional encodings. First, a trie-aware absolute positional encoding aggregates a token's (node's) local structural context (e.g., depth, ancestors, and descendants) into the token representation. Second, a topology-aware relative positional encoding injects pairwise structural relations into self-attention to capture topology-induced semantic relatedness. TrieRec is also model-agnostic, efficient, and hyperparameter-free. In our experiments, we implement TrieRec within three representative GR backbones, achieving notable improvements of 8.83% on average across four real-world datasets.
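To make the structural signals concrete, here is an illustrative sketch, not the authors' implementation: given items tokenized into hierarchical token sequences, it builds the induced prefix tree (trie) and derives the two kinds of features the abstract alludes to: per-node absolute context (depth, subtree size) and a pairwise relative relation (longest-common-prefix depth, i.e. the depth of the lowest common ancestor). All function names and the toy item vocabulary are made up for illustration.

```python
# Illustrative sketch (hypothetical helpers, not TrieRec's actual code):
# derive trie-structural features from hierarchically tokenized items.

def build_trie(items):
    """Map each token prefix (a tuple) to its set of child tokens."""
    children = {}
    for toks in items:
        for d in range(len(toks)):
            children.setdefault(tuple(toks[:d]), set()).add(toks[d])
    return children

def subtree_size(children, prefix):
    """Number of trie nodes at or below the node identified by `prefix`."""
    return 1 + sum(subtree_size(children, prefix + (c,))
                   for c in children.get(prefix, ()))

def node_features(items):
    """Absolute structural context per node: (depth, subtree size)."""
    children = build_trie(items)
    feats = {}
    for toks in items:
        for d in range(1, len(toks) + 1):
            p = tuple(toks[:d])
            feats[p] = (d, subtree_size(children, p))
    return feats

def lcp_depth(a, b):
    """Relative structural relation between two token paths: the depth of
    their lowest common ancestor = length of their longest common prefix."""
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

# Toy items: each is a 3-level token path induced by hierarchical tokenization.
items = [("a", "x", "1"), ("a", "x", "2"), ("a", "y", "1"), ("b", "z", "3")]
feats = node_features(items)
print(feats[("a",)])                  # (1, 6): depth 1; subtree of 6 nodes
print(lcp_depth(items[0], items[1]))  # 2: siblings under the ("a","x") node
print(lcp_depth(items[0], items[3]))  # 0: items on disjoint branches
```

In a trie-aware model, per-node features like these would presumably be embedded and added to token representations (the absolute encoding), while a matrix of pairwise values such as `lcp_depth` could be mapped to scalar biases added to self-attention logits (the relative encoding); the paper's exact parameterization may differ.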

Metadata

arXiv ID: 2602.21677
Provider: ARXIV
Primary Category: cs.IR
Published: 2026-02-25
Fetched: 2026-02-26 05:00
