March 18, 2026

PeriphAR: Fast and Accurate Real-World Object Selection with Peripheral Augmented Reality Displays

Authors

Yutong Ren, Arnav Reddy, Michael Nebeling

Abstract

Gaze-based selection in XR requires visual confirmation due to eye-tracking limitations and target ambiguity in 3D contexts. Current designs for wide-FOV displays use world-locked, central overlays, which are not conducive to always-on AR glasses. This paper introduces PeriphAR (per-ree-far), a visualization technique that leverages peripheral vision for feedback during gaze-based selection on a monocular AR display. In a first user study, we isolated text, color, and shape properties of target objects to compare peripheral selection cues. Peripheral vision was more sensitive to color than shape, but this sensitivity rapidly declined at lower contrast. To preserve preattentive processing of color, we developed two strategies to enhance color in users' peripheral vision. In a second user study, our strategy that maximized contrast of the target to the neighboring object with the most similar color was subjectively preferred. As proof of concept, we implemented PeriphAR in an end-to-end system to test performance with real-world object detection.
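
The abstract outlines, but does not detail, the preferred enhancement strategy ("maximized contrast of the target to the neighboring object with the most similar color"). The Python sketch below shows one possible reading of that idea; the function names, the candidate highlight palette, and the plain RGB distance metric are illustrative assumptions, not taken from the paper.

from itertools import product

def rgb_distance(c1, c2):
    # Euclidean distance between two RGB colors (0-255 per channel).
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def enhance_target_color(target_rgb, neighbor_rgbs, candidates=None):
    # Hypothetical helper: choose a highlight color for the gaze target that
    # maximizes contrast against the neighbor whose color is most similar to it.
    closest_neighbor = min(neighbor_rgbs, key=lambda n: rgb_distance(target_rgb, n))
    if candidates is None:
        # Assumed candidate palette: the eight corners of the RGB cube.
        candidates = [tuple(255 * v for v in bits) for bits in product((0, 1), repeat=3)]
    # Pick the candidate farthest (in RGB space) from that most-similar neighbor.
    return max(candidates, key=lambda c: rgb_distance(c, closest_neighbor))

# Example: a red target among a reddish neighbor and a blue neighbor
# yields a highlight color far from the reddish (most confusable) neighbor.
print(enhance_target_color((200, 40, 40), [(180, 60, 60), (30, 30, 200)]))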

Metadata

arXiv ID: 2603.18350
Provider: ARXIV
Primary Category: cs.HC
Published: 2026-03-18
Fetched: 2026-03-20 06:02
