- Intro to numerical optimization methods. Gradient descent (ru, en)
- How to accelerate gradient descent: conjugate gradient method, heavy-ball method and fast gradient method (ru, en)
- Second-order methods: Newton's method. Quasi-Newton methods as a trade-off between convergence speed and the cost of a single iteration (ru, en)
- Non-smooth optimization problems: subgradient methods and intro to proximal methods (ru, en)
- Least squares problem: matrix factorizations and the Levenberg-Marquardt algorithm (ru, en)
- Smoothing: smooth minimization of non-smooth functions (original paper) (ru, en)
- Simple constrained optimization problems: projected gradient method and Frank-Wolfe method (ru, en)
- General-purpose solvers: interior point methods (ru, en)
- How to parallelize optimization methods: penalty method, augmented Lagrangian method and ADMM (ru, en)
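To give a feel for the first topic, here is a minimal sketch of the basic gradient descent iteration x_{k+1} = x_k − α∇f(x_k); the quadratic test function, starting point, and step size are illustrative assumptions, not course material:

```python
def grad_descent(grad, x0, lr=0.1, n_iters=100):
    """Plain gradient descent: x_{k+1} = x_k - lr * grad(x_k)."""
    x = x0
    for _ in range(n_iters):
        x = x - lr * grad(x)
    return x

# Hypothetical test problem: minimize f(x) = (x - 3)^2, so grad f(x) = 2(x - 3).
x_star = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
```

For this quadratic the error contracts by a constant factor each step, so the iterate converges to the minimizer x = 3.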
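The acceleration lecture covers Nesterov's fast gradient method; a sketch using the common momentum coefficient (k − 1)/(k + 2), again on an assumed toy quadratic rather than anything from the course:

```python
def fast_gradient(grad, x0, lr=0.1, n_iters=200):
    """Nesterov's fast gradient method: gradient step at an extrapolated point."""
    x_prev, x = x0, x0
    for k in range(1, n_iters + 1):
        y = x + (k - 1) / (k + 2) * (x - x_prev)  # momentum extrapolation
        x_prev, x = x, y - lr * grad(y)           # gradient step at y
    return x

# Same hypothetical test problem: minimize f(x) = (x - 3)^2.
x_star = fast_gradient(lambda x: 2 * (x - 3), x0=0.0)
```

The method attains the O(1/k^2) rate in function value, versus O(1/k) for plain gradient descent on smooth convex problems.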
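For the second-order lecture, a one-dimensional Newton iteration illustrates the quadratic convergence that motivates the quasi-Newton trade-off; the test function f(x) = x − ln(1 + x) is a hypothetical choice with minimizer x = 0:

```python
def newton(grad, hess, x0, n_iters=10):
    """Newton's method in 1-D: x_{k+1} = x_k - f'(x_k) / f''(x_k)."""
    x = x0
    for _ in range(n_iters):
        x = x - grad(x) / hess(x)
    return x

# Hypothetical test problem: f(x) = x - ln(1 + x) on x > -1,
# with f'(x) = x / (1 + x) and f''(x) = 1 / (1 + x)^2.
x_star = newton(lambda x: x / (1 + x),
                lambda x: 1 / (1 + x) ** 2,
                x0=0.5)
```

The number of correct digits roughly doubles per iteration near the solution, which is why so few iterations suffice; quasi-Newton methods approximate the Hessian to avoid its full cost per iteration.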
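For the non-smooth lecture, a proximal gradient (ISTA-style) sketch: the prox of the l1 term is the soft-thresholding operator. The composite objective below is an assumed toy example, not from the course:

```python
def soft_threshold(v, t):
    """Proximal operator of t * |x| (soft-thresholding)."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def prox_gradient(grad, lam, x0, lr=1.0, n_iters=50):
    """Proximal gradient for min f(x) + lam * |x|, f smooth with gradient `grad`."""
    x = x0
    for _ in range(n_iters):
        x = soft_threshold(x - lr * grad(x), lr * lam)
    return x

# Hypothetical problem: min 0.5 * (x - 2)^2 + 1 * |x|, whose minimizer is x = 1.
x_star = prox_gradient(lambda x: x - 2.0, lam=1.0, x0=0.0)
```

A plain subgradient method would also solve this but only at an O(1/sqrt(k)) rate; exploiting the prox of the non-smooth part recovers the smooth-case rate.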
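The constrained-optimization lecture's projected gradient method only needs a projection onto the feasible set; for a box constraint that projection is a clamp. The interval and objective below are illustrative assumptions:

```python
def projected_gradient(grad, proj, x0, lr=0.1, n_iters=100):
    """Projected gradient: take a gradient step, then project back onto the set."""
    x = proj(x0)
    for _ in range(n_iters):
        x = proj(x - lr * grad(x))
    return x

# Hypothetical problem: minimize (x - 3)^2 subject to x in [0, 1];
# the constrained minimizer is the boundary point x = 1.
x_star = projected_gradient(lambda x: 2 * (x - 3),
                            lambda x: min(max(x, 0.0), 1.0),
                            x0=0.0)
```

Frank-Wolfe replaces the projection with a linear minimization over the set, which is cheaper for some feasible sets (e.g. the simplex or nuclear-norm balls).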
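Finally, for the parallelization lecture, a scalar ADMM sketch in scaled form: the objective is split as f(x) + g(z) with the consensus constraint x = z, and the three updates alternate. The toy lasso-style problem and penalty parameter below are assumptions for illustration:

```python
def soft_threshold(v, t):
    """Proximal operator of t * |z| (soft-thresholding)."""
    return max(v - t, 0.0) + min(v + t, 0.0)

def admm_lasso_1d(a, lam, rho=1.0, n_iters=100):
    """Scaled-form ADMM for min 0.5*(x - a)^2 + lam*|z|  s.t.  x = z."""
    x = z = u = 0.0
    for _ in range(n_iters):
        x = (a + rho * (z - u)) / (1 + rho)   # x-update: quadratic argmin
        z = soft_threshold(x + u, lam / rho)  # z-update: prox of lam*|.|
        u = u + x - z                         # scaled dual variable update
    return z

# Hypothetical problem: a = 2, lam = 1, whose minimizer is z = 1.
z_star = admm_lasso_1d(a=2.0, lam=1.0)
```

The splitting is what enables parallelism: with vector-valued x partitioned across machines, the x-update decomposes into independent subproblems coordinated only through z and u.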