A New Super-memory Gradient Method for Solving Unconstrained Optimization Problems

  • Abstract: A new super-memory gradient method for solving unconstrained optimization problems is proposed. At each iteration, the method uses information from the current and several previous iterates to generate a descent search direction, and determines the step size by a curve search rule. Because no matrices need to be computed or stored, the algorithm is well suited to large-scale unconstrained optimization problems. Global convergence and a linear convergence rate are established under relatively weak conditions, and numerical experiments show that the proposed algorithm is effective.
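
The sketch below illustrates the general shape of a super-memory gradient iteration as described in the abstract: the search direction combines the negative gradient with a bounded correction built from the last few search directions, and only a handful of vectors (no matrices) are stored. The paper's actual direction formula and curve search rule are not reproduced here; the weight scaling, the Armijo-type backtracking step used in place of the curve search, and all names and parameters (`super_memory_gradient`, `m`, `rho`, `sigma`, `shrink`) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def super_memory_gradient(f, grad, x0, m=3, rho=0.5, sigma=1e-4,
                          shrink=0.5, tol=1e-6, max_iter=5000):
    """Illustrative super-memory gradient iteration (not the paper's exact rules).

    Direction: d = -g + sum_i beta_i * d_prev_i, with the beta_i scaled so the
    memory term has norm at most rho*||g||, which keeps d a descent direction.
    Step size: Armijo backtracking, standing in for the paper's curve search.
    Only the last m direction vectors are kept; no matrices are used.
    """
    x = np.asarray(x0, dtype=float)
    memory = []                                   # last m search directions
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        d = -g
        for d_old in memory:
            beta = rho * gnorm / (len(memory) * np.linalg.norm(d_old))
            d = d + beta * d_old                  # bounded memory correction
        slope = g @ d                             # negative by construction
        alpha = 1.0
        while f(x + alpha * d) > f(x) + sigma * alpha * slope and alpha > 1e-16:
            alpha *= shrink                       # backtrack until sufficient decrease
        x = x + alpha * d
        memory.append(d)
        if len(memory) > m:
            memory.pop(0)                         # retain only the last m directions
    return x

# Usage: minimize a small convex quadratic f(x) = 0.5*x'Ax - b'x (minimizer A^{-1}b).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
quad = lambda z: 0.5 * z @ A @ z - b @ z
quad_grad = lambda z: A @ z - b
print(super_memory_gradient(quad, quad_grad, np.zeros(2)))    # approx. [0.2, 0.4]
```

In this sketch the weights are scaled so that the memory term never exceeds rho times the gradient norm, which gives g_k^T d_k <= -(1 - rho)||g_k||^2 < 0; this mirrors the descent property and the matrix-free storage pattern highlighted in the abstract, while the specific coefficients and the curve search of the paper would replace the assumed choices here.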

     
