TITLE:
A Kullback-Leibler Divergence for Bayesian Model Diagnostics
AUTHORS:
Chen-Pin Wang, Malay Ghosh
KEYWORDS:
Kullback-Leibler Distance, Model Diagnostic, Weighted Posterior Predictive P-Value
JOURNAL NAME:
Open Journal of Statistics, Vol. 1 No. 3, October 20, 2011
ABSTRACT: This paper considers a Kullback-Leibler distance (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert [1] when the reference model (against which a competing fitted model is compared) is correctly specified and certain regularity conditions hold (cf. Akaike [2]). We derive the asymptotic properties of this Goutis-Robert-Akaike KLD under those regularity conditions, and we examine how these asymptotic properties are affected when the regularity conditions are only partially satisfied. Furthermore, we establish the connection between the Goutis-Robert-Akaike KLD and a weighted posterior predictive p-value (WPPP). Finally, we apply both the Goutis-Robert-Akaike KLD and the WPPP to model comparison in various simulated examples as well as in two cohort studies of diabetes.
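For reference, the classical Kullback-Leibler divergence between a true density $f$ and an approximating density $g$ is stated below as a minimal reminder of the underlying quantity; the specific Goutis-Robert-Akaike variant that this paper considers is defined in the paper body, not in the abstract:

$$ D_{\mathrm{KL}}(f \,\|\, g) \;=\; \int f(y)\,\log\frac{f(y)}{g(y)}\, dy \;\ge\; 0, $$

with equality if and only if $f = g$ almost everywhere. Note that $D_{\mathrm{KL}}$ is asymmetric in its arguments, which is why the designation of a correctly specified reference model matters in the comparison.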