Quasi-Hyperbolic Rectified DEMON Adam/AMSGrad with AdaMod, Gradient Centralization, Lookahead, iterate averaging, and decoupled weight decay
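The title strings together several published optimizer modifications layered on a base Adam/AMSGrad step. As a minimal sketch (not the repository's code), the snippet below isolates two of the named ingredients: the quasi-hyperbolic momentum update of Ma & Yarats (2019) and the DEMON momentum-decay schedule of Chen et al. (2019). The function names and the defaults `nu=0.7`, `beta_init=0.9` are illustrative assumptions.

```python
import numpy as np

def demon_beta(beta_init, t, T):
    # DEMON schedule: decay momentum from beta_init toward 0 over T steps.
    frac = 1.0 - t / T
    return beta_init * frac / ((1.0 - beta_init) + beta_init * frac)

def qhm_demon_step(theta, buf, grad, lr, t, T, beta_init=0.9, nu=0.7):
    # One quasi-hyperbolic momentum step with a DEMON-decayed beta.
    beta = demon_beta(beta_init, t, T)
    buf = beta * buf + (1.0 - beta) * grad   # EMA momentum buffer
    step = (1.0 - nu) * grad + nu * buf      # QH mix of raw gradient and buffer
    return theta - lr * step, buf
```

The other components (rectification, AdaMod bounding, gradient centralization, Lookahead, iterate averaging, decoupled weight decay) are typically composed as further transformations of the gradient, the update, or the weights around a step like this one.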
A university project on the optimization of neural networks through hypergradient-based algorithms.
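For context on the topic name: hypergradient descent treats the learning rate itself as a parameter and updates it with the gradient of the loss with respect to that learning rate. Below is a minimal sketch of SGD with a hypergradient-adapted learning rate, following Baydin et al. (2018); the names and hyperparameter values are illustrative assumptions, not the project's interface.

```python
import numpy as np

def sgd_hd(grad_fn, theta, alpha=0.01, beta=1e-4, steps=100):
    # SGD whose learning rate alpha is itself trained by gradient descent,
    # using the hypergradient d f(theta_t)/d alpha = -grad_t . grad_{t-1}.
    prev_grad = np.zeros_like(theta)
    for _ in range(steps):
        grad = grad_fn(theta)
        # Grow alpha when consecutive gradients align, shrink it when they oppose.
        alpha += beta * np.dot(grad, prev_grad)
        theta = theta - alpha * grad
        prev_grad = grad
    return theta, alpha

# Example: minimize f(x) = ||x||^2 / 2, whose gradient is x itself.
theta, alpha = sgd_hd(lambda x: x, np.array([3.0, -2.0]), alpha=0.05)
```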