A Supermemory Gradient Method for Unconstrained Optimization and Its Global Convergence
Abstract: This paper presents a new class of supermemory gradient methods for unconstrained optimization problems and proves their global convergence under rather mild conditions. The linear convergence rate is analyzed when the objective function is uniformly convex.
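For context, a generic supermemory gradient iteration (shown here in its standard textbook form; the specific coefficient choices of the new class proposed in this paper are not reproduced in the abstract) builds the search direction from the current gradient together with the $m$ most recently used directions:

\[
d_k =
\begin{cases}
-g_k, & k = 0,\\[4pt]
-g_k + \displaystyle\sum_{i=1}^{\min(k,\,m)} \beta_{k,i}\, d_{k-i}, & k \ge 1,
\end{cases}
\qquad
x_{k+1} = x_k + \alpha_k d_k,
\]

where $g_k = \nabla f(x_k)$, $\alpha_k$ is a step size obtained from a line search (for example an Armijo-type rule), and the memory coefficients $\beta_{k,i}$ are chosen so that $d_k$ remains a sufficient descent direction. Global convergence results of the kind stated above are typically established under Lipschitz continuity of $\nabla f$ together with suitable bounds on the $\beta_{k,i}$.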