During this exercise, you will implement gradient descent to solve optimization problems. For all activities, the task is to find the lowest point on a function surface or, in mathematical language, to solve $\min_{\mathbf{x}} f(\mathbf{x})$.
All modules in `src` require your attention.
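All steps below rely on the same gradient descent update rule, stated here in its standard form with step size $\epsilon$ (a generic formulation, not copied from the starter code):

$$
\mathbf{x}_{k+1} = \mathbf{x}_k - \epsilon \nabla f(\mathbf{x}_k)
$$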
- To get started, take a look at `src/optimize_1d.py`. Use the gradient to find the minimum of the parabola starting from $x_0 = 5$, in other words, repeatedly apply the update rule above until $x$ stops changing (a minimal sketch follows this item). The problem is illustrated below.
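A minimal sketch of the one-dimensional case, assuming the parabola is $f(x) = x^2$ and a fixed step size (both assumptions; the real function is defined in `src/optimize_1d.py`):

```python
# Hedged sketch: assumes f(x) = x**2 and a fixed step size;
# the actual parabola lives in src/optimize_1d.py.


def parabola_grad(x: float) -> float:
    """Hand-derived derivative of the assumed parabola f(x) = x**2."""
    return 2.0 * x


def gradient_descent_1d(start: float, step_size: float = 0.1, steps: int = 100) -> float:
    """Follow the negative gradient from the starting point."""
    x = start
    for _ in range(steps):
        x = x - step_size * parabola_grad(x)
    return x


if __name__ == "__main__":
    print(gradient_descent_1d(5.0))  # approaches the minimum at x = 0.
```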
- Next we consider a paraboloid; its definition uses the scalar product, denoted $\cdot$, and it is already implemented in `src/optimize_2d.py`. Your task is to solve this problem using two-dimensional gradient descent (a sketch follows this item). Once more the problem is illustrated below.
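A hedged sketch of the two-dimensional case, assuming the paraboloid is simply $\mathbf{x} \cdot \mathbf{x}$ (an assumption; the exact function is the one in `src/optimize_2d.py`):

```python
# Hedged sketch: assumes the paraboloid p(x) = x . x; the actual
# implementation lives in src/optimize_2d.py.
import numpy as np


def paraboloid_grad(pos: np.ndarray) -> np.ndarray:
    """Hand-derived gradient of the assumed paraboloid x . x."""
    return 2.0 * pos


def gradient_descent_2d(start: np.ndarray, step_size: float = 0.1, steps: int = 100) -> np.ndarray:
    """Apply the descent update component-wise in two dimensions."""
    pos = start.astype(float)
    for _ in range(steps):
        pos = pos - step_size * paraboloid_grad(pos)
    return pos


if __name__ == "__main__":
    print(gradient_descent_2d(np.array([2.9, -2.9])))  # arbitrary example start, approaches the origin.
```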
- Additionally we consider a bumpy paraboloid, again defined via the scalar product (denoted $\cdot$) but with additional sine and cosine terms. These bumps will require momentum for convergence. The bumpy paraboloid is already implemented in `src/optimize_2d_momentum_bumpy.py`. Your task is to solve this problem using two-dimensional gradient descent with momentum (a sketch follows this item). Once more the problem is illustrated below.
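A hedged sketch of the heavy-ball momentum update; the bumpy objective and its true gradient live in `src/optimize_2d_momentum_bumpy.py`, so the gradient below is only a stand-in assumption:

```python
# Hedged sketch: bumpy_grad is a stand-in (paraboloid plus sine/cosine
# ripples); the real bumpy paraboloid lives in src/optimize_2d_momentum_bumpy.py.
import numpy as np


def bumpy_grad(pos: np.ndarray) -> np.ndarray:
    """Stand-in gradient: paraboloid term plus oscillating bump terms."""
    return 2.0 * pos + np.array([np.cos(pos[0]), -np.sin(pos[1])])


def momentum_descent(
    start: np.ndarray, step_size: float = 0.01, alpha: float = 0.9, steps: int = 500
) -> np.ndarray:
    """Accumulate a velocity so small bumps cannot trap the iterate."""
    pos = start.astype(float)
    velocity = np.zeros_like(pos)
    for _ in range(steps):
        velocity = alpha * velocity - step_size * bumpy_grad(pos)
        pos = pos + velocity
    return pos


if __name__ == "__main__":
    print(momentum_descent(np.array([2.9, -2.9])))  # arbitrary example start.
```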
- Finally, to explore the automatic differentiation functionality, we revisit the bumpy problem. The function is already defined in `src/optimize_2d_momentum_bumpy_torch.py`. We don't have to find the gradient by hand! Use `torch.func.grad` (see the torch documentation) to compute the gradient automatically, and use the result to find the minimum using momentum (a sketch follows this item).
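A hedged sketch of the automatic differentiation step; the `bumpy` function below is only a stand-in for whatever is defined in `src/optimize_2d_momentum_bumpy_torch.py`:

```python
# Hedged sketch: bumpy is a stand-in objective; the real function is defined
# in src/optimize_2d_momentum_bumpy_torch.py.
import torch


def bumpy(pos: torch.Tensor) -> torch.Tensor:
    """Stand-in bumpy paraboloid: scalar product plus sine/cosine bumps."""
    return torch.dot(pos, pos) + torch.sin(pos[0]) + torch.cos(pos[1])


# torch.func.grad turns the scalar-valued function into a function that
# returns its gradient, so no derivative has to be worked out by hand.
bumpy_grad = torch.func.grad(bumpy)


def momentum_descent(
    start: torch.Tensor, step_size: float = 0.01, alpha: float = 0.9, steps: int = 500
) -> torch.Tensor:
    """Momentum update driven by the automatically computed gradient."""
    pos = start.clone()
    velocity = torch.zeros_like(pos)
    for _ in range(steps):
        velocity = alpha * velocity - step_size * bumpy_grad(pos)
        pos = pos + velocity
    return pos


if __name__ == "__main__":
    print(momentum_descent(torch.tensor([2.9, -2.9])))  # arbitrary example start.
```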
While coding, use `nox -s test`, `nox -s lint`, and `nox -s typing` to check your code. Autoformatting help is available via `nox -s format`.
Feel free to read more about nox at https://nox.thea.codes/en/stable/.