
Tutorials

This folder contains a collection of Jupyter notebooks with tutorials illustrating the principles and the potential of ATHENA.

Tutorial 1 [.ipynb, .py, .html]

Here we show a basic application of active subspaces on a simple model in order to reconstruct and analyze it.
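The core computation behind active subspaces can be sketched in a few lines. The following is an illustrative NumPy sketch (not ATHENA's actual API): it builds the uncentered covariance of the model's gradients and takes its leading eigenvectors. The toy model `f(x) = (w @ x)**2` is an assumption chosen so the active subspace is exactly one-dimensional and spanned by `w`.

```python
import numpy as np

# Toy ridge model f(x) = (w @ x)**2, whose gradient is 2 * (w @ x) * w.
# Its active subspace is one-dimensional, spanned by w.
rng = np.random.default_rng(0)
dim, n_samples = 5, 500
w = np.array([1.0, 2.0, 0.0, 0.0, 0.0])
w /= np.linalg.norm(w)

inputs = rng.uniform(-1, 1, size=(n_samples, dim))
gradients = 2.0 * (inputs @ w)[:, None] * w  # exact gradients of f

# Uncentered covariance of the gradients and its eigendecomposition
C = gradients.T @ gradients / n_samples
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]            # sort eigenvalues descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The leading eigenvector spans the active subspace (up to sign)
active = eigvecs[:, 0]
print(abs(active @ w))  # close to 1.0
```

A sharp drop after the first eigenvalue is the signal that a one-dimensional active subspace exists; ATHENA wraps this whole procedure for you.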

Tutorial 2 [.ipynb, .py, .html]

Here we focus on a crucial step of the procedure: the evaluation of the gradients of the model with respect to the inputs. We show two possible methods to approximate the gradients from input-output pairs: Gaussian process regression, which makes use of GPy, and local linear gradients, which is implemented in ATHENA. In Tutorial 5 we use adjoint methods to reconstruct the gradients.
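The idea behind local linear gradients can be sketched from scratch (this is an illustrative NumPy version, not ATHENA's implementation): for each sample, fit a linear model to its k nearest neighbours, and take the fitted slope as the gradient estimate at that point.

```python
import numpy as np

rng = np.random.default_rng(1)
n, dim, k = 200, 3, 12

X = rng.uniform(-1, 1, size=(n, dim))
true_grad = np.array([3.0, -1.0, 0.5])
y = X @ true_grad                      # linear test model: gradient is exact

def local_linear_gradients(X, y, k):
    grads = np.empty_like(X)
    for i, x in enumerate(X):
        idx = np.argsort(np.linalg.norm(X - x, axis=1))[:k]  # k nearest points
        A = np.hstack([np.ones((k, 1)), X[idx]])             # [1, x] design matrix
        coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)    # local least squares
        grads[i] = coef[1:]                                  # drop the intercept
    return grads

grads = local_linear_gradients(X, y, k)
print(grads[0])  # ≈ [3.0, -1.0, 0.5] for this linear model
```

For a nonlinear model the estimates are only local approximations, so k trades off noise against bias; k must be at least dim + 1 for the least-squares fit to be determined.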

Tutorial 3 [.ipynb, .py, .html]

Here we exploit the presence of an active subspace to build one-dimensional response surfaces with Gaussian processes. We compare using the original model as the profile of the ridge approximation with using the optimal profile. This tutorial requires GPy, and pyhmc for Hamiltonian Monte Carlo.
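The one-dimensional response surface idea can be sketched without GPy: project the inputs onto the active direction, then regress the outputs on that single scalar. Below is a minimal NumPy sketch (exact GP posterior mean with an RBF kernel); the active direction `w1` and the ridge function are illustrative assumptions, not the tutorial's actual setup.

```python
import numpy as np

rng = np.random.default_rng(2)
dim, n = 4, 60
w1 = np.ones(dim) / np.sqrt(dim)               # assumed active direction

X = rng.uniform(-1, 1, size=(n, dim))
y = np.sin(2.0 * (X @ w1))                     # ridge function: f(x) = g(w1 . x)
t = X @ w1                                     # one-dimensional active variable

def rbf(a, b, ell=0.3):
    # Squared-exponential kernel on scalars
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

K = rbf(t, t) + 1e-8 * np.eye(n)               # jitter for numerical stability
t_new = np.linspace(-1, 1, 5)
mean = rbf(t_new, t) @ np.linalg.solve(K, y)   # GP posterior mean at new points

print(np.max(np.abs(mean - np.sin(2.0 * t_new))))  # approximation error
```

The payoff is that regression happens in one dimension regardless of the original input dimension, which is exactly what makes the active subspace useful for surrogate modeling.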

Tutorial 4 [.ipynb, .py, .html]

Here we show an application of the active subspaces property to speed up sampling from the posterior of an inverse problem with Gaussian prior and likelihood. We use the library Pyro for probabilistic programming and Hamiltonian Monte Carlo, and GPy for Gaussian process regression.

Tutorial 5 [.ipynb, .py, .html]

You need to run Tutorial 5, solver first. Here we show how an active subspace can be searched for in the case of a model with vectorial outputs. We use FEniCS to solve a Poisson problem with red noise in the diffusion coefficient (approximated with a truncated Karhunen-Loève decomposition). If you want to look at the active eigenvectors and K-L modes after having run the tutorial, open Tutorial 5, visualization tool.
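A truncated Karhunen-Loève expansion like the one used for the diffusion coefficient can be sketched on a 1D grid as follows. The exponential covariance kernel and correlation length below are illustrative assumptions, not the tutorial's exact setup.

```python
import numpy as np

n_grid, n_modes = 100, 10
x = np.linspace(0, 1, n_grid)

# Exponential covariance kernel, a common choice for correlated ("red") noise
corr_len = 0.2
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Eigendecomposition of the covariance matrix gives the K-L modes
eigvals, eigvecs = np.linalg.eigh(C)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # descending order

# One realization of the random field from n_modes i.i.d. standard normals
rng = np.random.default_rng(3)
xi = rng.standard_normal(n_modes)
field = eigvecs[:, :n_modes] @ (np.sqrt(eigvals[:n_modes]) * xi)

print(eigvals[:n_modes].sum() / eigvals.sum())  # fraction of variance retained
```

Truncating to the leading modes replaces an infinite-dimensional random field with a small vector of coefficients, which is what makes the active subspace search over the inputs tractable.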

Tutorial 6 [.ipynb, .py, .html]

Here we show how a kernel-based active subspace can be detected and employed when a standard active subspace is missing. We also describe the tuning procedure involved.

Tutorial 7 [.ipynb, .py, .html]

We present the nonlinear level-set learning (NLL) technique and compare it with active subspaces (AS).

More to come...

We plan to add more tutorials, but time is often against us. If you want to contribute a notebook on a feature not yet covered, we will be very happy to support you in editing it!

References

The main references for these tutorials are: