Numerical Results for Gauss-Seidel Iterative Algorithm Based on Newton Methods for Unconstrained Optimization Problems


Nguyen Dinh Dung

Abstract

Optimization problems play a crucial role in fields such as economics, engineering, and computer science; they involve finding the best value (maximum or minimum) of an objective function. In unconstrained optimization, the goal is to find a point at which the objective function attains its maximum or minimum without any constraints on the variables. Many methods exist for solving unconstrained optimization problems; one of them is Newton's method, which approximates the objective function by its second-order Taylor expansion. Using the first derivative (gradient) and the second derivative (Hessian matrix) of the function, Newton's method determines the search direction and step size used to locate the extremum. Under standard assumptions, the method converges quadratically near the solution and is particularly effective for problems with complicated mathematical structure. In this paper, we introduce a Gauss-Seidel-type algorithm implemented for the Newton and Quasi-Newton methods, an efficient approach for finding solutions to optimization problems when the objective function is a convex functional. We also present computational results for the algorithm that illustrate the convergence of the method.
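Since the abstract does not spell out the paper's algorithm, the following is only a generic illustrative sketch of a Gauss-Seidel-style Newton iteration for unconstrained minimization: coordinates are updated one at a time, and each one-dimensional Newton step immediately uses the latest values of the previously updated coordinates. All names and parameters here (`newton_gs_minimize`, `tol`, `max_sweeps`, the test function) are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical Gauss-Seidel-style Newton sweep for unconstrained minimization.
# Assumes the gradient and Hessian of the objective are supplied analytically.
import numpy as np

def newton_gs_minimize(grad, hess, x0, tol=1e-8, max_sweeps=50):
    """Sweep over coordinates, taking a 1-D Newton step in each:
    x_i <- x_i - g_i(x) / H_ii(x), where x already contains the
    updated earlier coordinates (Gauss-Seidel ordering)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_sweeps):
        if np.linalg.norm(grad(x)) < tol:   # stop once gradient is small
            break
        for i in range(x.size):
            # Recompute the gradient so this step sees the latest coordinates.
            x[i] -= grad(x)[i] / hess(x)[i, i]
    return x

# Smooth, strictly convex example: f(x, y) = x**2 + exp(x) + (y - 1)**2
f_grad = lambda v: np.array([2.0 * v[0] + np.exp(v[0]), 2.0 * (v[1] - 1.0)])
f_hess = lambda v: np.array([[2.0 + np.exp(v[0]), 0.0], [0.0, 2.0]])
x_star = newton_gs_minimize(f_grad, f_hess, x0=[1.0, 3.0])
```

In a full implementation, the plain Newton step would typically be safeguarded with damping or a line search, and the Quasi-Newton variants mentioned in the abstract would replace the exact Hessian with an approximation.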
