Advances in Pure Mathematics

Volume 15, Issue 5 (May 2025)

ISSN Print: 2160-0368   ISSN Online: 2160-0384

Google-based Impact Factor: 0.48

Generalized Bayesian Inference for Regression-Type Models with an Intractable Normalizing Constant

PP. 319-338   DOI: 10.4236/apm.2025.155016
Author(s): Gan, Q.Q. and Ye, W.Z.

ABSTRACT

Regression models with intractable normalizing constants are valuable tools for analyzing complex data structures, yet parameter inference for such models remains highly challenging, particularly when observations are discrete. Discrete state spaces introduce significant computational difficulties, as the normalizing constant often requires summation over extremely large or even infinite sets, which is typically infeasible in practice. These challenges are further compounded when observations are independent but not identically distributed. This paper addresses these issues by developing a novel generalized Bayesian inference approach tailored to regression models with intractable likelihoods. The key idea is to employ a specific form of generalized Fisher divergence to update beliefs about the model parameters, thereby circumventing the need to compute the normalizing constant. The resulting generalized posterior distribution can be sampled with standard computational tools, such as Markov chain Monte Carlo (MCMC), since its evaluation never involves the intractable normalizing constant.
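
To illustrate the general idea, the sketch below builds a generalized posterior of the form pi_w(theta | data) proportional to pi(theta) * exp(-w * sum_i L(theta; y_i, x_i)), where L is a loss that only involves ratios of the unnormalized model mass, and samples it with random-walk Metropolis. Everything here is an illustrative assumption rather than the authors' exact construction: a Poisson-type count regression with unnormalized mass q(y | x, theta) = lam^y / y!, lam = exp(x' theta), a univariate discrete Fisher-divergence-style loss formed from the ratios q(y-1)/q(y) and q(y)/q(y+1) (both free of the normalizing constant), a fixed weight w, and a vague Gaussian prior. The specific generalized Fisher divergence, weighting, and model used in the paper may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# ---- Simulated count-regression data (illustrative only) ------------------
n, d = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
theta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ theta_true))

# ---- Discrete Fisher-divergence-style loss (assumed form) ------------------
# For an unnormalized Poisson-type mass q(y | x, theta) = lam^y / y!, the
# ratios q(y-1)/q(y) = y/lam and q(y)/q(y+1) = (y+1)/lam do not involve the
# normalizing constant, so neither does the loss below.
def dfd_loss(theta, X, y):
    lam = np.exp(X @ theta)            # lam_i = exp(x_i' theta)
    backward = y / lam                 # q(y-1)/q(y); equals 0 when y = 0
    forward = (y + 1.0) / lam          # q(y)/q(y+1)
    return np.sum(backward**2 - 2.0 * forward)

# ---- Generalized posterior: pi_w(theta) ∝ pi(theta) * exp(-w * loss) -------
w = 1.0                                # loss weight / learning rate (assumed)
prior_sd = 10.0                        # vague Gaussian prior (assumed)

def log_gen_post(theta):
    log_prior = -0.5 * np.sum(theta**2) / prior_sd**2
    return log_prior - w * dfd_loss(theta, X, y)

# ---- Random-walk Metropolis sampler ----------------------------------------
def rw_metropolis(log_target, init, n_iter=20000, step=0.05):
    theta, logp = init.copy(), log_target(init)
    draws = np.empty((n_iter, init.size))
    for t in range(n_iter):
        prop = theta + step * rng.normal(size=init.size)
        logp_prop = log_target(prop)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = prop, logp_prop
        draws[t] = theta
    return draws

draws = rw_metropolis(log_gen_post, init=np.zeros(d))
print("generalized posterior mean:", draws[5000:].mean(axis=0))
```

Because the loss is built entirely from ratios of the unnormalized mass, the acceptance step of the sampler never touches the normalizing constant, which is the point the abstract emphasizes; the choice of weight w and proposal step size would in practice need calibration.
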

Share and Cite:

Gan, Q.Q. and Ye, W.Z. (2025) Generalized Bayesian Inference for Regression-Type Models with an Intractable Normalizing Constant. Advances in Pure Mathematics, 15, 319-338. doi: 10.4236/apm.2025.155016.

Copyright © 2025 by authors and Scientific Research Publishing Inc.

Creative Commons License

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.