OptimizationAlgorithms

This repository implements and compares descent-based optimization algorithms on several test functions, including the training loss of a neural network.
The implemented algorithms are Gradient Descent, Newton's method, and BFGS (L-BFGS, further quasi-Newton methods, and conjugate gradients are planned).
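As a rough illustration of the first two methods, here is a minimal sketch of a gradient-descent and a Newton step applied to the quadratic test function below; the function and variable names are illustrative, not the repository's actual API:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=1000):
    """Plain gradient descent: x <- x - lr * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def newton_method(grad, hess, x0, steps=20):
    """Newton's method: x <- x - H(x)^{-1} grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - np.linalg.solve(hess(x), grad(x))
    return x

# Quadratic f(x) = x Q x^T; gradient (Q + Q^T) x, constant Hessian Q + Q^T.
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
grad = lambda x: (Q + Q.T) @ x
hess = lambda x: Q + Q.T
x_gd = gradient_descent(grad, [3.0, -2.0])
x_nt = newton_method(grad, hess, [3.0, -2.0])
```

On a convex quadratic, Newton's method lands on the minimizer after a single step, while gradient descent converges geometrically at a rate set by the eigenvalues of Q + Q^T.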
The functions optimized include:
A basic quadratic function f(x) = x Q x^T with a custom matrix Q
The Rosenbrock function
A neural network trained to approximate the function f(x1, x2) = x1 * exp(-x1^2 - x2^2)
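The three objectives above can be written out as follows; this is a sketch under the stated formulas, and the function names are my own rather than the repository's:

```python
import numpy as np

def quadratic(x, Q):
    """f(x) = x Q x^T, written for a 1-D NumPy vector x."""
    return x @ Q @ x

def rosenbrock(x, a=1.0, b=100.0):
    """Rosenbrock: (a - x0)^2 + b (x1 - x0^2)^2; global minimum at (a, a^2)."""
    return (a - x[0])**2 + b * (x[1] - x[0]**2)**2

def target(x1, x2):
    """Function the network is trained to approximate: x1 * exp(-x1^2 - x2^2)."""
    return x1 * np.exp(-x1**2 - x2**2)
```

The Rosenbrock function is a standard stress test for descent methods: its curved, narrow valley makes plain gradient descent slow, which is what makes the comparison with Newton-type methods interesting.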
The resulting figures can be found in the files "dnn and bfgs report.pdf" and "GD and Newton report.pdf".
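For completeness, the BFGS method mentioned above can be sketched as follows; this is an illustrative implementation with a simple Armijo backtracking line search, not the repository's code:

```python
import numpy as np

def bfgs(f, grad, x0, steps=200, tol=1e-8):
    """BFGS with a backtracking (Armijo) line search.

    Maintains an approximation H of the inverse Hessian and updates it
    with the standard rank-two BFGS formula after every step.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                       # inverse-Hessian approximation
    g = grad(x)
    for _ in range(steps):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        t = 1.0                         # backtracking line search
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature guard keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Sanity check on the quadratic test function f(x) = x Q x^T.
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
f = lambda x: x @ Q @ x
grad = lambda x: (Q + Q.T) @ x
x_min = bfgs(f, grad, [3.0, -2.0])
```

Because H stays positive definite, p is always a descent direction and the backtracking loop terminates; on a convex quadratic, the rank-two update drives H toward the true inverse Hessian, giving Newton-like steps without ever forming second derivatives.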