- gradient_descent.py : An 'over-engineered' (modular) OOP implementation of gradient descent and its popular variants in Python (+ NumPy)
- Visualization.ipynb : Animated visualizations of the behaviour of the methods implemented above, in Python (+ Matplotlib)
- Vanilla Gradient Descent
- Momentum Gradient Descent (Inherits Vanilla Gradient Descent)
- Nesterov Gradient Descent (Inherits Momentum Gradient Descent)
- AdaGrad Gradient Descent (Inherits Vanilla Gradient Descent)
- RMSProp Gradient Descent (Inherits AdaGrad Gradient Descent)
- Adam Gradient Descent (Inherits Vanilla Gradient Descent) {Note: multiple inheritance from Momentum & RMSProp would be the 'logical' choice, but it was avoided because it looked messy} (a rough sketch of the hierarchy follows below)
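The actual class and method names in gradient_descent.py may differ; the following is only a minimal sketch of how such an inheritance hierarchy could look, assuming each optimizer exposes a `step(x, grad)` method that takes the current point and a gradient callable.

```python
import numpy as np


class VanillaGradientDescent:
    """Plain gradient descent: x <- x - lr * grad(x)."""

    def __init__(self, learning_rate=0.01):
        self.learning_rate = learning_rate

    def step(self, x, grad):
        return x - self.learning_rate * self.compute_update(x, grad)

    def compute_update(self, x, grad):
        return grad(x)


class MomentumGradientDescent(VanillaGradientDescent):
    """Keeps an exponentially decaying velocity of past gradients."""

    def __init__(self, learning_rate=0.01, momentum=0.9):
        super().__init__(learning_rate)
        self.momentum = momentum
        self.velocity = 0.0

    def compute_update(self, x, grad):
        self.velocity = self.momentum * self.velocity + grad(x)
        return self.velocity


class NesterovGradientDescent(MomentumGradientDescent):
    """Evaluates the gradient at the 'look-ahead' point instead of x."""

    def compute_update(self, x, grad):
        lookahead = x - self.learning_rate * self.momentum * self.velocity
        self.velocity = self.momentum * self.velocity + grad(lookahead)
        return self.velocity


class AdaGradGradientDescent(VanillaGradientDescent):
    """Scales each coordinate by the inverse sqrt of accumulated squared gradients."""

    def __init__(self, learning_rate=0.01, eps=1e-8):
        super().__init__(learning_rate)
        self.eps = eps
        self.accum = 0.0

    def compute_update(self, x, grad):
        g = grad(x)
        self.accum = self.accumulate(g)
        return g / (np.sqrt(self.accum) + self.eps)

    def accumulate(self, g):
        return self.accum + g ** 2


class RMSPropGradientDescent(AdaGradGradientDescent):
    """AdaGrad with a decaying (rather than unbounded) accumulator."""

    def __init__(self, learning_rate=0.01, decay=0.9, eps=1e-8):
        super().__init__(learning_rate, eps)
        self.decay = decay

    def accumulate(self, g):
        return self.decay * self.accum + (1 - self.decay) * g ** 2


class AdamGradientDescent(VanillaGradientDescent):
    """Bias-corrected first and second moment estimates (momentum + RMSProp combined)."""

    def __init__(self, learning_rate=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
        super().__init__(learning_rate)
        self.beta1, self.beta2, self.eps = beta1, beta2, eps
        self.m, self.v, self.t = 0.0, 0.0, 0

    def compute_update(self, x, grad):
        g = grad(x)
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * g
        self.v = self.beta2 * self.v + (1 - self.beta2) * g ** 2
        m_hat = self.m / (1 - self.beta1 ** self.t)
        v_hat = self.v / (1 - self.beta2 ** self.t)
        return m_hat / (np.sqrt(v_hat) + self.eps)
```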
- The implemented algorithms should also work as stochastic or batch gradient descent - this still has to be confirmed by testing with a gradient function that internally computes its gradient from a random sample / batch of the data (see the sketch after this list).
- Add cleaner code for dynamically updating ax.quiver in the animation function once matplotlib/matplotlib#22407 is available.
- Include variants of Adam Gradient Descent (or other interesting Gradient Descent algorithms)
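The stochastic / batch idea in the first TODO item could look roughly like the sketch below: since the optimizers only ever call a user-supplied gradient function, mini-batch behaviour comes from making that callable sample a fresh batch on every call. The names (`make_minibatch_grad`, the toy data, the `AdamGradientDescent` usage) are hypothetical illustrations, not the actual API of gradient_descent.py.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression data: y = X @ w_true + noise
X = rng.normal(size=(1000, 2))
w_true = np.array([2.0, -3.0])
y = X @ w_true + 0.1 * rng.normal(size=1000)


def make_minibatch_grad(X, y, batch_size=32):
    """Return a gradient callable that samples a new mini-batch each call,
    so the same optimizer classes perform stochastic / mini-batch descent."""
    def grad(w):
        idx = rng.integers(0, len(X), size=batch_size)
        Xb, yb = X[idx], y[idx]
        # Gradient of mean squared error on the sampled batch
        return 2.0 / batch_size * Xb.T @ (Xb @ w - yb)
    return grad


# Hypothetical usage with the Adam class sketched above
optimizer = AdamGradientDescent(learning_rate=0.05)
grad_fn = make_minibatch_grad(X, y)
w = np.zeros(2)
for _ in range(2000):
    w = optimizer.step(w, grad_fn)
print(w)  # should approach w_true = [2, -3]
```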