Gradient estimators for parameter inference in discrete stochastic kinetic models
AI Summary
The paper studies gradient-based parameter inference in discrete stochastic kinetic models.
Main contributions
- Evaluated three gradient estimators within the Gillespie algorithm
- Identified the strengths, weaknesses, and applicable regimes of each estimator
- Demonstrated that gradient-based parameter inference can be combined effectively with the Gillespie algorithm
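The Gillespie algorithm mentioned above can be sketched in a few lines; the discrete choice of which reaction fires is exactly the non-differentiable operation the gradient estimators address. This is a minimal illustrative sketch, not the paper's implementation; `stoich` and `propensity_fn` are hypothetical names.

```python
import numpy as np

def gillespie_step(rng, state, stoich, propensity_fn):
    """One step of the Gillespie direct method (illustrative sketch)."""
    a = propensity_fn(state)            # reaction propensities a_j(x)
    a0 = a.sum()
    tau = rng.exponential(1.0 / a0)     # waiting time ~ Exp(a0)
    # Discrete choice of which reaction fires: non-differentiable
    # with respect to the kinetic parameters entering the propensities.
    j = rng.choice(len(a), p=a / a0)
    return state + stoich[j], tau

# Toy birth-death process: 0 -> X at rate k, X -> 0 at rate g*X
k, g = 2.0, 0.1
stoich = np.array([[+1], [-1]])
propensities = lambda x: np.array([k, g * x[0]])

rng = np.random.default_rng(0)
state, t = np.array([0]), 0.0
while t < 50.0:
    state, tau = gillespie_step(rng, state, stoich, propensities)
    t += tau
```

The `rng.choice` call is the step that breaks automatic differentiation, motivating the estimators compared in the paper.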
Methodology
Compared the Gumbel-Softmax Straight-Through, Score Function, and Alternative Path gradient estimators across kinetic systems with different dynamics.
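To make the first of these estimators concrete, here is a plain-NumPy sketch of Gumbel-Softmax sampling with a straight-through output. In an autodiff framework the forward pass would return the hard one-hot sample while gradients flow through the soft relaxation (e.g. `hard + y_soft - y_soft.detach()` in PyTorch); NumPy has no autodiff, so only the sampling side is shown, and the function name is illustrative.

```python
import numpy as np

def gumbel_softmax_st(logits, tau, rng):
    """Sample a hard one-hot via Gumbel-Softmax; return the soft
    relaxation alongside it (the gradient path in autodiff frameworks)."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0,1) noise
    z = (logits + g) / tau              # temperature tau controls softness
    y_soft = np.exp(z - z.max())
    y_soft /= y_soft.sum()              # softened categorical probabilities
    hard = np.zeros_like(y_soft)
    hard[np.argmax(y_soft)] = 1.0       # hard one-hot used in the forward pass
    return hard, y_soft

rng = np.random.default_rng(1)
hard, soft = gumbel_softmax_st(np.log(np.array([0.7, 0.2, 0.1])),
                               tau=0.5, rng=rng)
```

As tau → 0 the soft sample approaches the hard one-hot, trading gradient bias for variance, which is consistent with the variance issues the paper reports in challenging parameter regimes.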
Original abstract
Stochastic kinetic models are ubiquitous in physics, yet inferring their parameters from experimental data remains challenging. In deterministic models, parameter inference often relies on gradients, as they can be obtained efficiently through automatic differentiation. However, these tools cannot be directly applied to stochastic simulation algorithms (SSA) such as the Gillespie algorithm, since sampling from a discrete set of reactions introduces non-differentiable operations. In this work, we adopt three gradient estimators from machine learning for the Gillespie SSA: the Gumbel-Softmax Straight-Through (GS-ST) estimator, the Score Function estimator, and the Alternative Path estimator. We compare the properties of all estimators in two representative systems exhibiting relaxation or oscillatory dynamics, where the latter requires gradient estimation of time-dependent objective functions. We find that the GS-ST estimator mostly yields well-behaved gradient estimates, but exhibits diverging variance in challenging parameter regimes, resulting in unsuccessful parameter inference. In these cases, the other estimators provide more robust, lower variance gradients. Our results demonstrate that gradient-based parameter inference can be integrated effectively with the Gillespie SSA, with different estimators offering complementary advantages.
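The Score Function estimator named in the abstract rests on the likelihood-ratio identity ∇θ E[f(X)] = E[f(X) ∇θ log p(X; θ)]. A toy Monte Carlo check of this identity, on an example not taken from the paper (X ~ Bernoulli(θ) with f(x) = x, whose true gradient is 1):

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 0.3
# Bernoulli(theta) samples
x = rng.uniform(size=200_000) < theta
# Score: d log p(x; theta) / d theta = 1/theta if x=1, -1/(1-theta) if x=0
score = np.where(x, 1.0 / theta, -1.0 / (1.0 - theta))
# Score-function (likelihood-ratio) Monte Carlo estimate of d E[X]/d theta
grad_est = np.mean(x * score)  # should be close to 1
```

The estimator is unbiased and needs no differentiable sampler, which is why it applies directly to the Gillespie SSA, at the cost of the higher variance typical of likelihood-ratio gradients.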