A New Memory Gradient Method for Solving Nonlinear Equations
Abstract: A new memory gradient method for solving systems of nonlinear equations is proposed, and its global convergence is proved. The algorithm does not depend on the choice of the initial point, and no inverse of the Jacobian matrix needs to be computed during the iterations, which reduces the computational cost of the algorithm and saves computing time. Compared with Newton's method, the new algorithm is better suited to solving large-scale systems of nonlinear equations.
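The abstract does not reproduce the algorithm itself, so the following is only a minimal illustrative sketch of a generic memory gradient iteration applied to the merit function f(x) = 0.5*||F(x)||^2, in which the gradient J(x)^T F(x) is approximated by finite differences and no Jacobian (or Jacobian inverse) is ever formed. The function name memory_gradient_solve, the weight beta_weight in the memory term, the Armijo backtracking rule, and the small test system are illustrative assumptions, not the quantities analysed in the paper.

import numpy as np

def memory_gradient_solve(F, x0, tol=1e-8, max_iter=500,
                          beta_weight=0.5, sigma=1e-4, shrink=0.5):
    """Illustrative memory gradient iteration for F(x) = 0 (not the paper's exact scheme).

    Minimizes the merit function f(x) = 0.5 * ||F(x)||^2; the gradient
    grad f(x) = J(x)^T F(x) is approximated by finite differences, so
    no Jacobian inverse is computed.
    """
    x = np.asarray(x0, dtype=float)
    d_prev = None

    def merit(z):
        r = F(z)
        return 0.5 * float(r @ r)

    def grad(z, eps=1e-7):
        # Forward-difference approximation of J(z)^T F(z), column by column.
        r = F(z)
        g = np.empty_like(z)
        for i in range(z.size):
            zp = z.copy()
            zp[i] += eps
            g[i] = ((F(zp) - r) @ r) / eps
        return g

    for _ in range(max_iter):
        if np.linalg.norm(F(x)) <= tol:
            break
        g = grad(x)
        if d_prev is None:
            d = -g
        else:
            # Memory term: mix steepest descent with the previous direction.
            # The weight below is a placeholder, not the rule proved convergent in the paper.
            beta = beta_weight * np.linalg.norm(g) / (np.linalg.norm(d_prev) + 1e-16)
            d = -g + beta * d_prev
            if g @ d >= 0:   # fall back if d is not a descent direction
                d = -g
        # Backtracking (Armijo) line search on the merit function.
        t, f0 = 1.0, merit(x)
        while merit(x + t * d) > f0 + sigma * t * (g @ d) and t > 1e-12:
            t *= shrink
        d_prev = d
        x = x + t * d
    return x

if __name__ == "__main__":
    # Hypothetical test system: x0^2 + x1 - 3 = 0, x0 + x1^2 - 5 = 0.
    F = lambda x: np.array([x[0] ** 2 + x[1] - 3.0, x[0] + x[1] ** 2 - 5.0])
    print(memory_gradient_solve(F, np.array([1.0, 1.0])))

As in the method described by the abstract, each step of this sketch uses only gradient-type information plus the previous search direction, which is what keeps the per-iteration cost low for large-scale systems compared with Newton-type iterations that solve a linear system with the Jacobian.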