Stochastic incremental mirror descent algorithms with Nesterov smoothing
Numerical Algorithms (IF 2.1) Pub Date: 2023-08-03, DOI: 10.1007/s11075-023-01574-1
Sandy Bitterlich, Sorin-Mihai Grad
For minimizing a sum of finitely many proper, convex, and lower semicontinuous functions over a nonempty closed convex set in a Euclidean space, we propose a stochastic incremental mirror descent algorithm constructed by means of Nesterov smoothing. We then modify the algorithm to minimize, over a nonempty closed convex set in a Euclidean space, a sum of finitely many proper, convex, and lower semicontinuous functions composed with linear operators. Next, a stochastic incremental mirror descent Bregman-proximal scheme with Nesterov smoothing is proposed for minimizing, over a nonempty closed convex set in a Euclidean space, the sum of finitely many proper, convex, and lower semicontinuous functions and a prox-friendly proper, convex, and lower semicontinuous function. Unlike previous contributions in the literature on mirror descent methods for minimizing sums of functions, we do not require these functions to be (Lipschitz) continuous or differentiable. Applications in logistics, tomography, and machine learning, modelled as optimization problems, illustrate the theoretical results.
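For orientation, the smoothing technique named in the abstract can be sketched in its standard form; the notation below is generic and not necessarily the paper's. Given a proper, convex, lower semicontinuous function f with conjugate f*, a 1-strongly convex prox-function d, and a smoothing parameter μ > 0:

```latex
% Nesterov smoothing of a proper, convex, lsc function f
% (d is a 1-strongly convex prox-function, \mu > 0 the smoothing parameter)
f_{\mu}(x) = \max_{u \in \operatorname{dom} f^{*}}
  \bigl\{ \langle x, u \rangle - f^{*}(u) - \mu\, d(u) \bigr\},
\qquad
\|\nabla f_{\mu}(x) - \nabla f_{\mu}(y)\| \le \tfrac{1}{\mu}\, \|x - y\| .
```

For example, f(x) = |x| with d(u) = u²/2 yields the Huber function, smooth everywhere yet close to |x| for small μ; this is how a nonsmooth, non-Lipschitz-differentiable summand can be handled by a gradient-based mirror descent step.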
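To make the ingredients concrete, here is a minimal illustrative sketch, not the authors' exact scheme: it combines a Nesterov (Huber-type) smoothing of the absolute value with a stochastic incremental mirror descent over the probability simplex, using the entropy mirror map. The problem instance, step-size rule, and fixed smoothing parameter are all simplifying assumptions made for this example.

```python
import numpy as np

def smooth_abs_grad(t, mu):
    # Gradient of the Nesterov (Huber-type) smoothing of |t|:
    # f_mu(t) = t**2 / (2*mu) if |t| <= mu, else |t| - mu/2.
    return np.clip(t / mu, -1.0, 1.0)

def stochastic_incremental_md(A, b, steps=3000, mu=0.05, seed=0):
    # Minimize sum_i |a_i . x - b_i| over the probability simplex,
    # sampling one summand per iteration (the "incremental" part) and
    # taking a mirror step for the entropy Bregman distance
    # (multiplicative / exponentiated-gradient update).
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.full(n, 1.0 / n)              # start at the simplex center
    for k in range(1, steps + 1):
        i = rng.integers(m)              # pick one summand at random
        g = smooth_abs_grad(A[i] @ x - b[i], mu) * A[i]  # grad of smoothed term
        eta = 1.0 / np.sqrt(k)           # diminishing step size (illustrative)
        x = x * np.exp(-eta * g)         # entropy mirror step
        x /= x.sum()                     # Bregman projection onto the simplex
    return x
```

The entropy mirror map keeps every iterate strictly inside the simplex without an explicit Euclidean projection, which is the usual motivation for mirror descent over this constraint set; the smoothing supplies gradients for the otherwise nonsmooth absolute-value terms.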