February 27, 2026

OmniTrack: General Motion Tracking via Physics-Consistent Reference

Authors

Yuhan Li, Peiyuan Zhi, Yunshen Wang, Tengyu Liu, Sixu Yan, Wenyu Liu, Xinggang Wang, Baoxiong Jia, Siyuan Huang

Abstract

Learning motion tracking from rich human motion data is a foundational task for achieving general control in humanoid robots, enabling them to perform diverse behaviors. However, discrepancies in morphology and dynamics between humans and robots, combined with data noise, introduce physically infeasible artifacts in reference motions, such as floating and penetration. During both training and execution, these artifacts create a conflict between following inaccurate reference motions and maintaining the robot's stability, hindering the development of a generalizable motion tracking policy. To address these challenges, we introduce OmniTrack, a general tracking framework that explicitly decouples physical feasibility from general motion tracking. In the first stage, a privileged generalist policy generates physically plausible motions that strictly adhere to the robot's dynamics via trajectory rollout in simulation. In the second stage, the general control policy is trained to track these physically feasible motions, ensuring stable and coherent control transfer to the real robot. Experiments show that OmniTrack improves tracking accuracy and demonstrates strong generalization to unseen motions. In real-world tests, OmniTrack achieves hour-long, consistent, and stable tracking, including complex acrobatic motions such as flips and cartwheels. Additionally, we show that OmniTrack supports human-style stable and dynamic online teleoperation, highlighting its robustness and adaptability to varying user inputs.
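The two-stage decoupling described above can be sketched in miniature. Everything below — the one-dimensional "dynamics," the clipping limit, the proportional tracker, and the numbers — is a hypothetical stand-in chosen to illustrate the idea, not the paper's actual models or training setup.

```python
# Toy sketch of the two-stage decoupling: stage 1 rolls out a privileged
# policy in simulation so the resulting states are feasible by construction;
# stage 2 then tracks that physics-consistent reference.
# All dynamics, policies, and numbers are hypothetical stand-ins.

def simulate_step(state, action):
    """Stand-in 'physics': the state moves halfway toward the action, then is
    clipped to [-1, 1] to mimic a hard feasibility limit."""
    nxt = state + 0.5 * (action - state)
    return max(-1.0, min(1.0, nxt))

def stage1_rollout(raw_reference, init_state):
    """Stage 1: a privileged policy chases the raw (possibly infeasible)
    human reference inside the simulator; the states it actually reaches
    become the new, physics-consistent reference."""
    state, rollout = init_state, []
    for target in raw_reference:
        state = simulate_step(state, target)  # act straight toward the raw target
        rollout.append(state)
    return rollout

def stage2_tracking_error(gain, reference, init_state):
    """Stage 2 (evaluation only): a simple proportional tracker follows a
    given reference; training would tune `gain` to minimize this error."""
    state, err = init_state, 0.0
    for target in reference:
        state = simulate_step(state, state + gain * (target - state))
        err += abs(target - state)
    return err

raw_ref = [0.0, 0.8, 1.6, 2.4]           # contains values the toy 'robot' cannot reach
feasible_ref = stage1_rollout(raw_ref, 0.0)
assert max(feasible_ref) <= 1.0          # rollout respects the dynamics limit by construction
# tracking the physics-consistent reference avoids the conflict the raw one creates
assert stage2_tracking_error(1.0, feasible_ref, 0.0) < stage2_tracking_error(1.0, raw_ref, 0.0)
```

In this toy version, the raw reference demands states beyond the clipping limit (the analogue of floating or penetration artifacts), so any tracker chasing it accumulates irreducible error; replaying the stage-1 rollout as the reference removes that conflict before stage-2 training begins.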

Metadata

arXiv ID: 2602.23832
Provider: ARXIV
Primary Category: cs.RO
Published: 2026-02-27
Fetched: 2026-03-02 06:04

Website: https://omnitrack-humanoid.github.io/