March 24, 2026

Beyond Explanation: Evidentiary Rights for Algorithmic Accountability

Authors

Matthew Stewart

Abstract

Algorithmic accountability scholarship has focused heavily on explanation: helping affected parties understand why decisions were made. We argue this focus is insufficient. Explanation without evidentiary access does not enable meaningful contestation. A person told "your risk score was 0.73" understands the decision but cannot verify the score, test alternatives, or produce counter-evidence. We introduce a taxonomy of contestation failures, showing that most accountability interventions address only one failure mode (opacity) while leaving four others unaddressed. Drawing on an analysis of 168 legal cases spanning algorithmic decision-making contexts, we find that contestation faces a two-gate structure: a procedural gate (evidentiary access) and a doctrinal gate (substantive liability rules). Among litigated cases, those without evidentiary access almost never succeed (9%); those with access succeed at rates approaching 97% in domains without liability shields. Where doctrinal immunities apply (e.g., Section 230), even full evidentiary scrutiny produces no liability. This association almost certainly reflects selection effects; our empirical contribution is diagnostic rather than causal. The data identify where contestation fails among observable cases, not whether providing access would change outcomes for currently excluded cases. We propose evidentiary rights as the missing procedural component and develop counterfactual interrogation rights, which allow affected parties to probe decision systems with modified inputs and observe whether outcomes change, without requiring disclosure of model internals. This reframes algorithmic accountability from a transparency problem to a procedural rights problem.
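
The abstract does not specify an interface for counterfactual interrogation; the sketch below is one minimal illustration of the idea, assuming a hypothetical black-box decision_system callable whose internals are never disclosed. All names here (counterfactual_probe, toy_risk_scorer) are invented for illustration and do not come from the paper.

# Minimal sketch of a counterfactual interrogation protocol (hypothetical).
# `decision_system` stands in for any opaque scoring service: only its
# input/output behavior is observed, so no model internals are exposed.

from typing import Any, Callable, Dict, List, Tuple

def counterfactual_probe(
    decision_system: Callable[[Dict[str, Any]], float],
    original_input: Dict[str, Any],
    modifications: List[Dict[str, Any]],
) -> List[Tuple[Dict[str, Any], float, float]]:
    """Submit the original input and each modified variant, and record
    how the outcome changes. Returns (modification, baseline, altered)."""
    baseline = decision_system(original_input)
    results = []
    for mod in modifications:
        variant = {**original_input, **mod}  # override only the probed fields
        results.append((mod, baseline, decision_system(variant)))
    return results

# Toy stand-in for an opaque risk scorer (purely illustrative).
def toy_risk_scorer(applicant: Dict[str, Any]) -> float:
    score = 0.5
    if applicant.get("income", 0) < 30_000:
        score += 0.2
    if applicant.get("zip_code") in {"60612", "48205"}:
        score += 0.15  # a proxy feature a counterfactual probe could surface
    return min(score, 1.0)

if __name__ == "__main__":
    applicant = {"income": 25_000, "zip_code": "60612"}
    probes = [{"income": 60_000}, {"zip_code": "94016"}]
    for mod, base, alt in counterfactual_probe(toy_risk_scorer, applicant, probes):
        print(f"change {mod}: {base:.2f} -> {alt:.2f}")

Running the sketch shows how an affected party could surface outcome-relevant features (here, the zip-code proxy) from input/output behavior alone, which is the contrast the abstract draws with disclosure of model internals.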

Metadata

arXiv ID: 2603.22716
Provider: ARXIV
Primary Category: cs.CY
Published: 2026-03-24
Fetched: 2026-03-25 06:02
Comment: Accepted at ACM FAccT 2026. Dataset available at https://doi.org/10.5281/zenodo.18069759
