gradient-descents

Contents:

  • gradient_descent.py : An 'over-engineered' (modular) OOP implementation of gradient descent and its popular variants in Python (+ numpy); a sketch of the class layout follows the algorithm list below
  • Visualization.ipynb : Animated visualizations in Python (+ matplotlib) for observing the behaviour of the methods implemented above

Algorithms implemented:

  • Vanilla Gradient Descent
    GradientDescent
  • Momentum Gradient Descent (inherits from Vanilla Gradient Descent)
    MomentumGradientDescent
  • Nesterov Gradient Descent (inherits from Momentum Gradient Descent)
    NesterovGradientDescent1D
    NesterovGradientDescent2D
  • AdaGrad Gradient Descent (inherits from Vanilla Gradient Descent)
    AdaGradGradientDescent
  • RMSProp Gradient Descent (inherits from AdaGrad Gradient Descent)
    RMSPropGradientDescent
  • Adam Gradient Descent (inherits from Vanilla Gradient Descent; the 'logical' multiple inheritance from Momentum & RMSProp was avoided because it looked messy)
    AdamGradientDescent
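
The class names above are the ones exposed by gradient_descent.py; everything else in the sketch below (the step(w) interface, parameter names, and defaults) is an assumption about how such a hierarchy could be laid out, not the repository's actual API:

```python
import numpy as np

class GradientDescent:
    """Vanilla gradient descent: w <- w - lr * grad(w)."""
    def __init__(self, grad_fn, lr=0.01):
        self.grad_fn = grad_fn  # callable: w -> gradient at w
        self.lr = lr

    def step(self, w):
        return w - self.lr * self.grad_fn(w)

class MomentumGradientDescent(GradientDescent):
    """Adds a velocity term: v <- beta * v + grad(w), then w <- w - lr * v."""
    def __init__(self, grad_fn, lr=0.01, beta=0.9):
        super().__init__(grad_fn, lr)
        self.beta = beta
        self.v = 0.0

    def step(self, w):
        self.v = self.beta * self.v + self.grad_fn(w)
        return w - self.lr * self.v

class AdamGradientDescent(GradientDescent):
    """Keeps its own first/second moment estimates instead of
    inheriting them from Momentum and RMSProp (see the note above)."""
    def __init__(self, grad_fn, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
        super().__init__(grad_fn, lr)
        self.beta1, self.beta2, self.eps = beta1, beta2, eps
        self.m, self.s, self.t = 0.0, 0.0, 0

    def step(self, w):
        g = self.grad_fn(w)
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * g
        self.s = self.beta2 * self.s + (1 - self.beta2) * g**2
        m_hat = self.m / (1 - self.beta1**self.t)  # bias-corrected moments
        s_hat = self.s / (1 - self.beta2**self.t)
        return w - self.lr * m_hat / (np.sqrt(s_hat) + self.eps)
```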

Future plans:

  • The implemented algorithms should also work for stochastic and mini-batch gradient descent; this still has to be confirmed by testing that an optimizer given a gradient function which computes the stochastic / mini-batch gradient behaves correctly (see the sketch after this list).
  • Clean up the dynamic update of ax.quiver in the animation function once matplotlib/matplotlib#22407 is available.
  • Add variants of Adam Gradient Descent (or other interesting gradient descent algorithms).
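
As an illustration of the first item above: since the optimizers only ever call a user-supplied gradient function, stochastic / mini-batch behaviour should be obtainable by having that callable sample a batch internally. The snippet below reuses the hypothetical GradientDescent class sketched earlier; the data and batching are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize ||X @ w - y||^2 over w.
X = rng.normal(size=(100, 2))
w_true = np.array([3.0, -1.0])
y = X @ w_true + 0.1 * rng.normal(size=100)

def minibatch_grad(w, batch_size=10):
    """Gradient of the mean squared error on a random mini-batch."""
    idx = rng.choice(len(X), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    return (2.0 / batch_size) * Xb.T @ (Xb @ w - yb)

opt = GradientDescent(minibatch_grad, lr=0.05)  # class from the sketch above
w = np.zeros(2)
for _ in range(500):
    w = opt.step(w)
print(w)  # should land close to w_true despite the noisy gradients
```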