Generative Bayesian Computation (GBC) provides a simulation-based approach to Bayesian inference. A Quantile Neural Network (QNN) is trained to map samples from a base distribution to the posterior distribution. Our method applies equally to parametric and likelihood-free models. By generating a large training dataset of parameter-output pairs, inference is recast as a supervised learning problem of non-parametric regression. Generative quantile methods have a number of advantages over traditional approaches such as approximate Bayesian computation (ABC) or GANs. Primarily, quantile architectures are density-free and exploit feature selection using dimensionality-reducing summary statistics. To illustrate our methodology, we analyze the classic normal-normal learning model and apply it to two real-data problems: modeling traffic speed and building a surrogate model for a satellite drag dataset. We compare our methodology to state-of-the-art approaches. Finally, we conclude with directions for future research.
Generative AI for Bayesian Computation.
Authors: Nick Polson, Vadim Sokolov
| Journal: | Entropy | Impact Factor: | 2.000 |
| Year: | 2025 | Issue/Pages: | 2025 Jun 26; 27(7):683 |
| DOI: | 10.3390/e27070683 | | |
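
The abstract describes training a quantile neural network on simulated parameter-output pairs and then reading off posterior draws by feeding in uniform quantile levels. The sketch below illustrates that idea on the normal-normal learning model mentioned in the abstract. It is a minimal illustration under assumed settings (PyTorch, a small two-hidden-layer network, a standard normal prior, unit-variance likelihood, and arbitrary training hyperparameters), not the authors' implementation.

```python
# Minimal sketch of generative quantile-based posterior inference.
# Illustrative assumptions throughout: network size, prior, likelihood,
# and training settings are chosen for readability, not taken from the paper.
import torch
import torch.nn as nn

torch.manual_seed(0)

# --- 1. Simulate training pairs (theta, y) from the normal-normal model ---
# Prior: theta ~ N(0, 1); likelihood: y | theta ~ N(theta, 1).
N = 20_000
theta = torch.randn(N, 1)        # draws from the prior
y = theta + torch.randn(N, 1)    # one simulated observation per draw

# --- 2. Quantile neural network: inputs (y, tau), output a theta-quantile ---
qnn = nn.Sequential(
    nn.Linear(2, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(qnn.parameters(), lr=1e-3)

def pinball_loss(pred, target, tau):
    """Check (pinball) loss averaged over the batch."""
    u = target - pred
    return torch.mean(torch.maximum(tau * u, (tau - 1.0) * u))

for epoch in range(200):
    tau = torch.rand(N, 1)       # fresh random quantile levels each pass
    pred = qnn(torch.cat([y, tau], dim=1))
    loss = pinball_loss(pred, theta, tau)
    opt.zero_grad()
    loss.backward()
    opt.step()

# --- 3. Posterior sampling: feed uniform tau's together with the observed y ---
y_obs = 1.5
taus = torch.rand(5_000, 1)
inputs = torch.cat([torch.full_like(taus, y_obs), taus], dim=1)
with torch.no_grad():
    post_samples = qnn(inputs)

print("QNN posterior mean/std:  ", post_samples.mean().item(), post_samples.std().item())
print("Exact posterior mean/std:", y_obs / 2, 0.5 ** 0.5)
```

Because the normal-normal model is conjugate, the exact posterior N(y_obs/2, 1/2) is available in closed form, so the final print statements provide a quick sanity check that the learned quantile map is behaving as expected.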
