Generalized Bayesian Inference for Regression-Type Models with an Intractable Normalizing Constant
ABSTRACT
Regression models with intractable normalizing constants are valuable tools for analyzing complex data structures, yet parameter inference for such models remains highly challenging, particularly when the observations are discrete. Discrete state spaces introduce significant computational difficulties: the normalizing constant often requires a summation over an extremely large, or even infinite, set, which is typically infeasible in practice. These challenges are compounded when the observations are independent but not identically distributed. This paper addresses these issues by developing a novel generalized Bayesian inference approach tailored to regression models with intractable likelihoods. The key idea is to employ a specific form of generalized Fisher divergence to update beliefs about the model parameters, thereby circumventing computation of the normalizing constant. The resulting generalized posterior distribution can be sampled with standard computational tools, such as Markov chain Monte Carlo (MCMC), so the intractability of the normalizing constant is avoided entirely.
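To make the abstract's idea concrete, the mechanics of a generalized posterior can be sketched as follows. The sketch below is illustrative only and is not the paper's method: it uses an unnormalized Conway-Maxwell-Poisson count model (a hypothetical choice of model with an intractable normalizing constant), a standard discrete Fisher divergence estimator built from pmf ratios (in which the normalizing constant cancels), assumed Gaussian priors on log-parameters, a learning rate `beta = 1`, and plain random-walk Metropolis. The paper's specific divergence, model, and tuning may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model: Conway-Maxwell-Poisson with unnormalized pmf
#   q(y | lam, nu) = lam**y / (y!)**nu.
# Its normalizing constant is an infinite sum, but pmf *ratios* are closed form:
#   q(y+1)/q(y) = lam / (y+1)**nu,   q(y)/q(y-1) = lam / y**nu.

def discrete_fisher_loss(y, lam, nu):
    """Empirical discrete Fisher divergence, up to a theta-free constant:
    mean_i [ r(y_i)**2 - 2 * r(y_i - 1) * 1{y_i >= 1} ],
    where r(y) = q(y+1)/q(y). Only pmf ratios appear, so the
    normalizing constant is never computed."""
    r_fwd = lam / (y + 1.0) ** nu
    r_bwd = np.where(y >= 1, lam / np.maximum(y, 1.0) ** nu, 0.0)
    return np.mean(r_fwd ** 2 - 2.0 * r_bwd)

def log_gen_posterior(theta, y, beta=1.0):
    """Generalized posterior: log prior - beta * n * loss.
    Hypothetical N(0, 10^2) priors on (log lam, log nu)."""
    loglam, lognu = theta
    logprior = -0.5 * (loglam ** 2 + lognu ** 2) / 100.0
    loss = discrete_fisher_loss(y, np.exp(loglam), np.exp(lognu))
    return logprior - beta * len(y) * loss

def rw_metropolis(y, n_iter=5000, step=0.1):
    """Random-walk Metropolis on theta = (log lam, log nu)."""
    theta = np.zeros(2)
    logp = log_gen_posterior(theta, y)
    chain = np.empty((n_iter, 2))
    for t in range(n_iter):
        prop = theta + step * rng.standard_normal(2)
        logp_prop = log_gen_posterior(prop, y)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = prop, logp_prop
        chain[t] = theta
    return chain

# Synthetic counts: Poisson(4) is COM-Poisson with nu = 1.
y = rng.poisson(4.0, size=200).astype(float)
chain = rw_metropolis(y)
lam_post = np.exp(chain[2500:, 0]).mean()  # posterior mean after burn-in
```

In a regression-type version of this sketch, `lam` would be tied to covariates, e.g. `lam_i = exp(x_i @ beta_coef)`, giving independent but not identically distributed observations; the ratio-based loss and the Metropolis step are unchanged.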
Share and Cite:
Gan, Q.Q. and Ye, W.Z. (2025) Generalized Bayesian Inference for Regression-Type Models with an Intractable Normalizing Constant. Advances in Pure Mathematics, 15, 319-338. doi: 10.4236/apm.2025.155016.