Function-Space Empirical Bayes Regularisation with Student's t Priors
AI Summary
This work proposes a novel function-space empirical Bayes regularisation framework that uses Student's t priors to make uncertainty estimates more robust.
Key Contributions
- Proposes the ST-FS-EB framework, which applies Student's t priors for function-space regularisation
- Employs heavy-tailed distributions in both parameter and function spaces
- Optimises the objective via variational inference and MC dropout
Methodology
The method combines function-space empirical Bayes regularisation with Student's t priors and variational inference, approximating the posterior distribution via MC dropout.
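The two ingredients of the methodology — a heavy-tailed Student's t prior over function values and MC-dropout sampling of network outputs — can be sketched as below. This is a minimal illustration, not the paper's implementation: the toy linear model, the degrees-of-freedom setting ν = 4, and all function names are assumptions made for the example.

```python
import math
import random

def student_t_log_pdf(x, nu=4.0, mu=0.0, sigma=1.0):
    """Log-density of a Student's t distribution; its tails are heavier
    than a Gaussian's, which is the property the prior exploits."""
    z = (x - mu) / sigma
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi) - math.log(sigma)
            - (nu + 1) / 2 * math.log(1 + z * z / nu))

def mc_dropout_outputs(x, weights, p_drop=0.1, n_samples=20, rng=None):
    """Draw function outputs from a toy linear model under MC dropout:
    each weight is independently zeroed with probability p_drop per sample,
    so the samples approximate draws from the posterior predictive."""
    rng = rng or random.Random(0)
    outs = []
    for _ in range(n_samples):
        y = sum(w * xi for w, xi in zip(weights, x) if rng.random() > p_drop)
        outs.append(y)
    return outs

def function_space_t_regulariser(outputs, nu=4.0):
    """Function-space penalty: negative mean log-density of the sampled
    outputs under the Student's t prior (an ELBO-style prior term)."""
    return -sum(student_t_log_pdf(y, nu=nu) for y in outputs) / len(outputs)

# Example: regularise the sampled outputs of the toy model at one input.
samples = mc_dropout_outputs(x=[1.0, 2.0], weights=[0.5, -0.3])
penalty = function_space_t_regulariser(samples)
```

The heavy tails matter because the log-prior penalises outliers only logarithmically (the `log(1 + z²/ν)` term) rather than quadratically as a Gaussian would, so occasional extreme network outputs are not punished excessively.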
Original Abstract
Bayesian deep learning (BDL) has emerged as a principled approach to produce reliable uncertainty estimates by integrating deep neural networks with Bayesian inference, yet the selection of informative prior distributions remains a significant challenge. Various function-space variational inference (FSVI) regularisation methods have been presented, assigning meaningful priors over model predictions. However, these methods typically rely on a Gaussian prior, which fails to capture the heavy-tailed statistical characteristics inherent in neural network outputs. By contrast, this work proposes a novel function-space empirical Bayes regularisation framework -- termed ST-FS-EB -- which employs heavy-tailed Student's $t$ priors in both parameter and function spaces. We further approximate the posterior distribution through variational inference (VI), inducing an evidence lower bound (ELBO) objective based on Monte Carlo (MC) dropout. Furthermore, the proposed method is evaluated against various VI-based BDL baselines, and the results demonstrate its robust performance in in-distribution prediction, out-of-distribution (OOD) detection, and handling distribution shifts.