Paper
Hardness of High-Dimensional Linear Classification
Authors
Alexander Munteanu, Simon Omlor, Jeff M. Phillips
Abstract
We establish new lower bounds, exponential in the dimension, for the Maximum Halfspace Discrepancy problem, which models linear classification. The problem is fundamental in computational geometry and machine learning, in both its exact and approximate forms. However, only $O(n^d)$ and, respectively, $\tilde O(1/\varepsilon^d)$ upper bounds are known, complemented by polynomial lower bounds that do not capture the exponential dependence on the dimension. We close this gap up to polylogarithmic factors by reductions from widely believed hardness conjectures for Affine Degeneracy testing and the $k$-Sum problem. Our reductions yield matching lower bounds of $\tilde\Omega(n^d)$ and, respectively, $\tilde\Omega(1/\varepsilon^d)$ based on Affine Degeneracy testing, and $\tilde\Omega(n^{d/2})$ and, respectively, $\tilde\Omega(1/\varepsilon^{d/2})$ conditioned on $k$-Sum. The first bound also holds unconditionally if the computational model is restricted to sidedness queries, a widespread setting implemented and optimized in many contemporary algorithms and computing paradigms.
Metadata
arXiv ID: 2603.19061v1
Published: 2026-03-19
Comments: SoCG 2026
Categories: cs.CG (primary), cs.DS, cs.LG, stat.ML