Beyond the Markovian Assumption: Robust Optimization via Fractional Weyl Integrals in Imbalanced Data
AI Summary
Proposes an optimization algorithm based on fractional calculus to address overfitting on imbalanced data.
Main Contributions
- Proposes an optimization algorithm based on the fractional Weyl integral
- Uses historical gradient information as a natural regularizer
- Improves PR-AUC by roughly 40% over classical optimizers in financial fraud detection
Methodology
The instantaneous gradient is replaced with a weighted fractional Weyl integral over past gradients, constructing a dynamically weighted historical sequence that serves as the core of the optimization algorithm.
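The idea above can be sketched in code. The paper's exact weighting scheme is not given here, so this is a minimal illustration under an assumed power-law kernel w_k ∝ k^(α−1)/Γ(α) (a common discretization of fractional integrals): each update direction is a weighted average of the last K gradients, with the newest gradient weighted most, rather than the instantaneous gradient alone. The function names `weyl_weights` and `fractional_memory_gd` are hypothetical, not from the paper.

```python
import numpy as np
from math import gamma

def weyl_weights(alpha: float, K: int) -> np.ndarray:
    """Power-law memory kernel w_k ~ k^(alpha-1) / Gamma(alpha).
    Assumed form: the paper's exact weighting is not specified here."""
    k = np.arange(1, K + 1, dtype=float)
    w = k ** (alpha - 1) / gamma(alpha)
    return w / w.sum()  # normalize so the update scale matches plain GD

def fractional_memory_gd(grad_fn, x0, lr=0.1, alpha=0.7, K=5, steps=300):
    """Gradient descent where each update averages the last K gradients
    (newest weighted most) instead of using only the current gradient."""
    x = np.asarray(x0, dtype=float)
    w = weyl_weights(alpha, K)
    history = []  # past gradients, oldest first
    for _ in range(steps):
        history.append(np.asarray(grad_fn(x), dtype=float))
        recent = history[::-1][:K]   # newest first
        wk = w[: len(recent)]
        wk = wk / wk.sum()           # renormalize while history is short
        g = sum(wi * gi for wi, gi in zip(wk, recent))
        x = x - lr * g
    return x
```

On a toy quadratic (grad_fn = lambda x: 2 * x), the memory term smooths the trajectory: a single noisy gradient is diluted by the weighted history, which is the regularizing effect the summary attributes to the fractional memory operator.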
Original Abstract
Standard Gradient Descent and its modern variants assume local, Markovian weight updates, making them highly susceptible to noise and overfitting. This limitation becomes critically severe in extremely imbalanced datasets such as financial fraud detection where dominant class gradients systematically overwrite the subtle signals of the minority class. In this paper, we introduce a novel optimization algorithm grounded in Fractional Calculus. By isolating the core memory engine of the generalized fractional derivative, the Weighted Fractional Weyl Integral, we replace the instantaneous gradient with a dynamically weighted historical sequence. This fractional memory operator acts as a natural regularizer. Empirical evaluations demonstrate that our method prevents overfitting in medical diagnostics and achieves an approximately 40 percent improvement in PR-AUC over classical optimizers in financial fraud detection, establishing a robust bridge between pure fractional topology and applied Machine Learning.