Paper
AnkleType: A Hands- and Eyes-free Foot-based Text Entry Technique in Virtual Reality
Authors
Xiyun Luo, Weirong Luo, Kening Zhu, Taizhou Chen
Abstract
Virtual Reality (VR) emphasizes immersive experiences, yet text entry often requires the hands or visual attention, which can disrupt the interaction flow in VR. We present AnkleType, a hands- and eyes-free text-entry technique that leverages ankle-based gestures in both standing and sitting postures. We began with two preliminary studies: one investigated the movement range of users' ankles, and the other elicited user-preferred ankle gestures for text-entry-related operations. The findings of these two studies guided the design of AnkleType. To optimize AnkleType's keyboard layout for eyes-free input, we conducted a user study that captured users' natural ankle spatial awareness with a computer-simulated language test. Through a pairwise comparison study, we designed a bipedal input strategy for sitting (BPSit) and a unipedal input strategy for standing (UPStand). Our first in-VR text-entry evaluation with 16 participants demonstrated that our methods supported average typing speeds of 8.99 WPM (BPSit) and 9.13 WPM (UPStand) for first-time users. We further evaluated our design with a 7-day longitudinal study with twelve participants. Participants achieved average typing speeds of 15.05 WPM with UPStand and 16.70 WPM with BPSit in the visual condition, and 11.15 WPM and 12.87 WPM, respectively, in the eyes-free condition.
Metadata
arXiv: 2603.21915v1 (cs.HC)
Published: 2026-03-23
DOI: 10.1145/3772318.3790999
Journal reference: In Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems
PDF: https://arxiv.org/pdf/2603.21915v1