Paper
How Proxy Race Distorts Regression-Based Fairness Audits
Authors
Xi Xin, Giles Hooker, Fei Huang
Abstract
Proxy-based race inference is increasingly used to conduct fairness assessments when protected-class data are unavailable or legally restricted -- most prominently in U.S. fair-lending enforcement, and now explicitly contemplated in emerging insurance regulation, including Colorado's draft SB21-169 testing framework and New York's Insurance Circular Letter No. 7. Despite this growing regulatory relevance, little is known about how standard regression-based discrimination analyses behave when race is measured with error through proxies such as Bayesian Improved Surname Geocoding (BISG) or Bayesian Improved First Name and Surname Geocoding (BIFSG). This paper studies the consequences of using proxy-imputed race as a categorical regressor in regression-based fairness assessments. Treating proxy race as a categorical covariate subject to misclassification, we show that proxy-based coefficients become weighted mixtures of true group effects, systematically shrinking estimated disparities toward the majority group -- even when overall classification accuracy is high. Empirically, using a linked North Carolina voter-insurance dataset with self-reported race and ZIP-level auto insurance premiums, we demonstrate two mechanisms through which proxy imputation distorts inference: (i) the intrinsic mixing of group effects implied by misclassification, and (ii) structured errors that vary with ZIP-level racial composition and socioeconomic conditions and remain correlated with pricing residuals after controls. As a result, regression-based disparity estimates can be attenuated or amplified relative to analogous analyses based on self-reported race. Our findings caution against treating proxy race as a plug-in substitute in regulatory testing and highlight design implications for proxy-based audit frameworks in insurance and other high-stakes domains.
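The attenuation mechanism the abstract describes can be illustrated with a minimal simulation (a hedged sketch, not the paper's code or data): two groups with a known true outcome gap, race labels flipped at a fixed misclassification rate, and the group disparity re-estimated from the proxy labels. The group sizes, gap, and error rate below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# True race: 80% majority (0), 20% minority (1); true group gap = 10.
race = rng.binomial(1, 0.2, n)
y = 100.0 + 10.0 * race + rng.normal(0, 5, n)

# Symmetric misclassification: each label flipped with probability 0.15,
# i.e. 85% "overall accuracy" -- high, but not high enough.
flip = rng.binomial(1, 0.15, n).astype(bool)
proxy = np.where(flip, 1 - race, race)

def group_gap(label):
    """Difference in mean outcome between labeled groups 1 and 0."""
    return y[label == 1].mean() - y[label == 0].mean()

print(f"true-race gap:  {group_gap(race):.2f}")
print(f"proxy-race gap: {group_gap(proxy):.2f}")  # shrunk toward 0
```

Because each proxy group is a mixture of both true groups, the proxy-based gap is a weighted average of the true effects and comes out well below 10 despite 85% label accuracy, consistent with the attenuation-toward-the-majority result stated in the abstract.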