This repository explores descent-based optimization algorithms and tests them on several functions, including the training of a neural network.
The implemented algorithms are Gradient Descent, Newton's Method, and BFGS (L-BFGS, additional quasi-Newton variants, and conjugate gradients are planned).
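As a minimal sketch of the gradient descent update `x ← x − lr·∇f(x)` (not the repository's implementation; the step size, iteration count, and the positive-definite matrix `Q` below are illustrative assumptions):

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, iters=200):
    """Plain gradient descent: repeatedly step against the gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - lr * grad(x)
    return x

# Quadratic f(x) = x^T Q x with a hypothetical positive-definite Q;
# its gradient is (Q + Q^T) x, so the unique minimizer is x = 0.
Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])
grad = lambda x: (Q + Q.T) @ x

x_star = gradient_descent(grad, x0=[5.0, -3.0])
```

For this `Q` and step size the iteration contracts toward the minimizer at the origin; a step size larger than 2 divided by the largest eigenvalue of `Q + Q^T` would instead diverge.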
The functions optimized include:
- A quadratic function f(x) = x^T Q x with a custom matrix Q
- The Rosenbrock function
- A neural-network approximation of f(x1, x2) = x1 * exp(-x1^2 - x2^2)
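For the Rosenbrock function, a BFGS run can be sketched with SciPy's generic optimizer (shown here for illustration only; the repository implements its own BFGS):

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    """Rosenbrock: sum of 100*(x[i+1] - x[i]^2)^2 + (1 - x[i])^2.

    The global minimum is 0, attained at the all-ones vector.
    """
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

# BFGS from the origin; SciPy approximates the gradient numerically
# when none is supplied.
res = minimize(rosenbrock, x0=np.zeros(2), method="BFGS")
```

The Rosenbrock function's narrow curved valley makes it a standard stress test: plain gradient descent zigzags slowly along the valley floor, while quasi-Newton methods such as BFGS use curvature estimates to reach the minimum at (1, 1) in far fewer iterations.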
The result figures can be found in the files "dnn and bfgs report.pdf" and "GD and Newton report.pdf".
shaiTheKimhi/OptimizationAlgorithms