Paper
vPET-ABC: Fast Voxelwise Approximate Bayesian Inference for Kinetic Modeling in PET
Authors
Qinlin Gu, Gaelle M. Emvalomenos, Evan D. Morris, Clara Grazian, Steven R. Meikle
Abstract
Dynamic PET kinetic modeling increasingly demands voxelwise uncertainty quantification and robust model selection. Yet total-body PET (TB-PET) data volumes make conventional Bayesian approaches, such as per-voxel MCMC, computationally impractical, while deep models typically require retraining and careful revalidation when tracers, protocols, or kinetic models change, without necessarily improving inference speed. Vectorized voxelwise approximate Bayesian computation (vPET-ABC) is introduced as a likelihood-free, model-agnostic posterior inference framework for dynamic PET kinetic modeling at total-body scale. The method replaces explicit likelihood evaluation with forward simulations and a discrepancy test, then exploits full vectorization to transform voxelwise inference into an embarrassingly parallel workload suited to modern GPUs. In simulation, vPET-ABC produced posterior summaries with small divergence from sequential Monte Carlo baselines, and posterior mean estimates significantly more accurate than non-negative least squares (NNLS). For model selection between the linear parametric neurotransmitter model (lp-ntPET) and the multilinear reference tissue model, vPET-ABC maintained high sensitivity under high noise with moderate loss of specificity, whereas NNLS with the Bayesian information criterion exhibited the opposite trade-off, with near-zero sensitivity. In a human cigarette smoking dataset, vPET-ABC yielded denser probabilistic activation maps than lp-ntPET with an effective number of parameters. On a 50 min total-body [18F]FDG study, vPET-ABC generated high-quality whole-volume K_i parametric images within practical runtimes on a single GPU, while also preserving local spatial correlation better than NNLS. Overall, vPET-ABC delivers fast, training-free, uncertainty-aware inference that scales to TB-PET and remains portable across tracers and kinetic models.
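The mechanism the abstract describes — sampling parameters from a prior, running a vectorized forward simulation, and replacing the likelihood with a discrepancy threshold — can be illustrated with a toy rejection-ABC sketch. Everything below is a hypothetical stand-in, not the paper's method: a mono-exponential uptake model in place of a real PET kinetic model, an RMSE discrepancy, a per-voxel quantile threshold, and NumPy broadcasting on CPU in place of the paper's GPU implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative only): each "voxel" has a time-activity curve from
# a mono-exponential uptake model y(t) = K * (1 - exp(-k2 * t)), plus noise.
t = np.linspace(0.5, 50.0, 20)              # frame mid-times (min)
n_voxels = 100
true_K = rng.uniform(0.5, 1.5, n_voxels)
true_k2 = rng.uniform(0.05, 0.3, n_voxels)

def forward(K, k2):
    # Vectorized forward model; broadcasts over any leading particle axes.
    return K[..., None] * (1.0 - np.exp(-k2[..., None] * t))

data = forward(true_K, true_k2) + rng.normal(0.0, 0.05, (n_voxels, t.size))

# Rejection ABC: draw one shared pool of prior particles, simulate once, and
# compare every particle against every voxel in a single broadcast operation.
n_particles = 2000
K_p = rng.uniform(0.0, 2.0, n_particles)
k2_p = rng.uniform(0.01, 0.5, n_particles)

sims = forward(K_p[:, None], k2_p[:, None])            # (n_particles, 1, n_frames)
disc = np.sqrt(np.mean((sims - data) ** 2, axis=-1))   # (n_particles, n_voxels)

# Per-voxel adaptive threshold: keep the best 1% of particles at each voxel.
eps = np.quantile(disc, 0.01, axis=0)                  # (n_voxels,)
accept = disc <= eps                                   # boolean acceptance mask

# Approximate posterior mean of K at every voxel, from accepted particles.
post_K = (accept * K_p[:, None]).sum(axis=0) / accept.sum(axis=0)
```

The point of the sketch is the shape of the computation: one particle pool is simulated once and scored against all voxels simultaneously via broadcasting, so per-voxel inference becomes a single dense array operation — the embarrassingly parallel structure the abstract attributes to vPET-ABC, which maps directly onto a GPU.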
Metadata
arXiv ID: 2603.14859v1
Published: 2026-03-16
Primary category: stat.AP
Links: https://arxiv.org/abs/2603.14859v1 (abstract), https://arxiv.org/pdf/2603.14859v1 (PDF)
Comment: Q. Gu and G. M. Emvalomenos contributed equally to this work