This folder contains implementations of several methods for finding extrema of mathematical functions. Each method has its own strengths and suits different types of optimization problems.
Methods Included:
- Gradient Method with Step Splitting
The gradient method with step splitting is an iterative first-order optimization algorithm: it steps along the negative gradient of the function and, whenever a trial step fails to decrease the function value, repeatedly splits (scales down) the step size until it does. A minimal sketch appears after this list.
- Steepest Descent Method (also called the fastest descent method)
The steepest descent method is an iterative gradient-based algorithm for finding the minimum of a function. At each iteration it moves along the negative gradient at the current point, choosing the step length by minimizing the function along that direction (an exact line search). See the sketch after this list.
- Newton's Method
Newton's method is an iterative numerical technique for finding the roots of a function; applied to optimization, it locates stationary points by solving the equation ∇f(x) = 0, using both first-order (gradient) and second-order (Hessian) derivative information. See the sketch after this list.
- Conjugate Gradient Method
The conjugate gradient method is an iterative technique for solving systems of linear equations with a symmetric positive definite matrix, which is equivalent to minimizing the corresponding quadratic function, and it extends to general optimization problems. It combines aspects of gradient descent and direct methods: search directions are built from gradients but kept mutually conjugate, so in exact arithmetic it terminates in at most n steps for an n-dimensional system. A sketch follows the list.
- Coordinate Descent Method
The coordinate descent method is an optimization algorithm that updates one variable at a time while holding the others fixed, cycling through the coordinates and minimizing the function with respect to each in turn. A sketch appears after this list.
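
Below is a minimal sketch of the step-splitting scheme, assuming a simple quadratic test objective; the function, parameter names, and splitting factor are illustrative choices, not the folder's actual API.

```python
import numpy as np

def f(x):
    # Illustrative objective: an elongated quadratic bowl with minimum at the origin.
    return x[0]**2 + 5.0 * x[1]**2

def grad_f(x):
    # Analytic gradient of f.
    return np.array([2.0 * x[0], 10.0 * x[1]])

def gradient_step_splitting(x0, alpha0=1.0, split=0.5, eps=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < eps:          # stop when the gradient is small
            break
        alpha = alpha0
        # Split (halve) the trial step until it actually decreases the function.
        while f(x - alpha * g) >= f(x):
            alpha *= split
        x = x - alpha * g
    return x

print(gradient_step_splitting([4.0, -3.0]))  # -> approximately [0, 0]
```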
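
A sketch of steepest descent with an exact line search, assuming the quadratic objective f(x) = 1/2 xᵀAx − bᵀx, for which the optimal step along the negative gradient has a closed form; the matrix, vector, and names here are illustrative.

```python
import numpy as np

def steepest_descent(A, b, x0, eps=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = A @ x - b                       # gradient of the quadratic objective
        if np.linalg.norm(g) < eps:
            break
        alpha = (g @ g) / (g @ (A @ g))     # exact minimizer of f along -g
        x = x - alpha * g
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])      # symmetric positive definite
b = np.array([1.0, 2.0])
print(steepest_descent(A, b, np.zeros(2)))  # -> the solution of A x = b
```

The closed-form alpha is what distinguishes this sketch from plain gradient descent with a fixed step: the step length is recomputed at every iteration so that f is minimized exactly along the current direction.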
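
A sketch of Newton's method for minimization, assuming the Rosenbrock function f(x) = (1 − x0)² + 100 (x1 − x0²)² with its analytic gradient and Hessian; all names are illustrative, not the folder's actual API.

```python
import numpy as np

def grad(x):
    # Gradient of the Rosenbrock function.
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])

def hess(x):
    # Analytic Hessian of the Rosenbrock function.
    return np.array([
        [2.0 - 400.0 * x[1] + 1200.0 * x[0]**2, -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

def newton(x0, eps=1e-8, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < eps:
            break
        # Newton step: solve H d = g rather than forming the inverse explicitly.
        x = x - np.linalg.solve(hess(x), g)
    return x

print(newton([0.0, 0.0]))  # -> approximately [1, 1]
```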
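
A sketch of the linear conjugate gradient method for A x = b with a symmetric positive definite A, which is equivalent to minimizing f(x) = 1/2 xᵀAx − bᵀx; the test system is an illustrative choice.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, eps=1e-10, max_iter=None):
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x            # residual = negative gradient of f at x
    p = r.copy()             # the first search direction is the residual
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # exact step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < eps:
            break
        p = r + (rs_new / rs) * p    # next direction, A-conjugate to the previous ones
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))  # -> the same solution as A x = b
```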
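
Finally, a sketch of coordinate descent, narrowed for simplicity to the quadratic f(x) = 1/2 xᵀAx − bᵀx: minimizing exactly over one coordinate at a time yields the Gauss–Seidel-style update below. Names and the test system are illustrative.

```python
import numpy as np

def coordinate_descent(A, b, x0=None, eps=1e-10, max_iter=1000):
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Exact 1-D minimizer in coordinate i with the others held fixed:
            # set df/dx_i = 0  =>  x_i = (b_i - sum_{j != i} A_ij x_j) / A_ii.
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
        if np.linalg.norm(x - x_old) < eps:   # stop when a full sweep barely moves x
            break
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(coordinate_descent(A, b))  # -> again the solution of A x = b
```

For general (non-quadratic) objectives the inner update would be replaced by a one-dimensional line search in coordinate i, but the sweep structure stays the same.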