ATHENA: Advanced Techniques for High dimensional parameter spaces to Enhance Numerical Analysis
- Description
- Dependencies and installation
- Documentation
- Testing
- Examples and tutorials
- How to cite
- Authors and contributors
- How to contribute
- License
ATHENA is a Python package for the reduction of high-dimensional parameter spaces in the context of numerical analysis. It allows the use of several dimensionality reduction techniques such as Active Subspaces (AS), Kernel-based Active Subspaces (KAS), and Nonlinear Level-set Learning (NLL). It is particularly suited for the study of parametric PDEs, for sensitivity analysis, and for the approximation of engineering quantities of interest. It can handle both scalar and vectorial high-dimensional functions, making it a useful tool also for reducing the burden of computationally intensive optimization tasks.
See the Examples and Tutorials section below and the tutorials folder to get an idea of the potential of this package. Also check out the SISSA mathLab Medium publication, where you can find stories about ATHENA (search within the publication page).
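To give a flavour of the workflow, below is a minimal usage sketch of the Active Subspaces technique. The class and method names assume the interface of recent releases, and the input data are synthetic placeholders; refer to the tutorials folder and the online documentation for the exact signatures and complete examples.

```python
import numpy as np
# NOTE: the names below (ActiveSubspaces, fit, transform, plot_eigenvalues)
# assume the interface of recent ATHENA releases; check the documentation
# and the tutorials for the exact signatures.
from athena.active import ActiveSubspaces

# Toy data: inputs x and gradients of f(x) = sin(x_0 + 2 x_1), a function
# that varies along a single direction of the 7-dimensional input space.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(500, 7))
df = np.zeros_like(x)
df[:, 0] = np.cos(x[:, 0] + 2 * x[:, 1])
df[:, 1] = 2 * np.cos(x[:, 0] + 2 * x[:, 1])

asub = ActiveSubspaces(dim=1)        # look for a 1-dimensional active subspace
asub.fit(inputs=x, gradients=df)     # estimate it from the gradient samples
reduced = asub.transform(x)          # map inputs to active (and inactive) coordinates
asub.plot_eigenvalues()              # eigenvalue decay of the gradient covariance
```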
ATHENA requires numpy, matplotlib, scipy, torch, GPyOpt, scikit-learn, scikit-learn-extra, sphinx (for the documentation) and pytest (for local testing).
The code is compatible with Python 3.8 and above. It can be installed directly
from the source code or via pip.
To install the latest release of the package, which corresponds to the online documentation, just type:
> pip install athena-mathlab
To install the latest version of the package just type:
> pip install git+https://github.com/mathLab/ATHENA.git
The official distribution is on GitHub, and you can clone the repository using
> git clone https://github.com/mathLab/ATHENA
To install your own local branch you can use the setup.py file:
> pip install -e .
To uninstall the package simply run:
> pip uninstall athena-mathlab
ATHENA uses Sphinx for code documentation. You can view the documentation online here. To build the html version of the docs locally, simply run:
> cd docs
> make html
The generated html can be found in docs/build/html. Open up the index.html you find there to browse.
We are using GitHub Actions for continuous integration testing. You can check out the current status here.
To run tests locally (pytest is required):
> pytest
You can find useful tutorials on how to use the package in the tutorials folder.
Below are some illustrative examples of what you can do with this package.
A 7-dimensional quadratic function with a 1-dimensional active subspace.
Parameter space deformation using the nonlinear level-set learning technique for a 2-dimensional cubic function.
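As a rough illustration of the idea behind the first example, the following NumPy-only sketch (the underlying technique from Constantine, 2015, not the ATHENA API) shows how a 1-dimensional active subspace emerges from the eigendecomposition of the gradient covariance matrix. The quadratic f(x) = 0.5 (a^T x)^2 is a hypothetical stand-in for the 7-dimensional function above.

```python
import numpy as np

# f(x) = 0.5 * (a^T x)^2 varies only along the direction a, so its active
# subspace is 1-dimensional and spanned by a.
rng = np.random.default_rng(0)
dim, n_samples = 7, 1000
a = rng.standard_normal(dim)

x = rng.uniform(-1, 1, size=(n_samples, dim))   # input samples
grads = (x @ a)[:, None] * a                    # gradient of f at each sample

# Uncentered covariance of the gradients and its eigendecomposition
C = grads.T @ grads / n_samples
eigvals, eigvecs = np.linalg.eigh(C)

active_dir = eigvecs[:, -1]                     # dominant eigenvector
print(eigvals)                                  # a single (numerically) nonzero eigenvalue
print(np.abs(active_dir @ a) / np.linalg.norm(a))  # ~1: aligned with a
```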
If you use this package in your publications please cite the package as follows:
F. Romor, M. Tezzele, and G. Rozza. ATHENA: Advanced Techniques for High dimensional parameter spaces to Enhance Numerical Analysis. Software Impacts, 10:100133, 2021. doi:10.1016/j.simpa.2021.100133
Or, if you use BibTeX:
@article{romor2020athena,
    author = {Romor, Francesco and Tezzele, Marco and Rozza, Gianluigi},
    doi = {10.1016/j.simpa.2021.100133},
    journal = {Software Impacts},
    pages = {100133},
    title = {{ATHENA: Advanced Techniques for High dimensional parameter spaces to Enhance Numerical Analysis}},
    volume = {10},
    year = {2021}
}
To implement the numerical methods present in this package we followed these works:
- Constantine. Active subspaces: Emerging ideas for dimension reduction in parameter studies. Volume 2 SIAM Spotlights, 2015. [DOI].
- Constantine et al. Python Active-subspaces Utility Library. Journal of Open Source Software, 1(5), 79, 2016. [DOI].
- Romor, Tezzele, Lario, Rozza. Kernel-based Active Subspaces with application to CFD parametric problems using Discontinuous Galerkin method. 2022. [DOI].
- Zhang, Zhang, Hinkle. Learning nonlinear level sets for dimensionality reduction in function approximation. Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, 8-14 December 2019, Vancouver, BC, Canada. [arXiv].
- Romor, Tezzele, Rozza. A local approach to parameter space reduction for regression and classification tasks. 2021. [arXiv].
Here is a list of the scientific works involving ATHENA that you can consult and/or cite. If you want to add one, please open a PR.
- Tezzele, Romor, Rozza. Reduction in Parameter Space, in Advanced Reduced Order Methods and Applications in Computational Fluid Dynamics. 2022. [DOI].
- Tezzele, Fabris, Sidari, Sicchiero, Rozza. A multi-fidelity approach coupling parameter space reduction and non-intrusive POD with application to structural optimization of passenger ship hulls. 2022. [DOI].
- Teng, Wang, Ju, Gruber, Zhang. Level set learning with pseudo-reversible neural networks for nonlinear dimension reduction in function approximation. 2021. [arXiv].
- Romor, Tezzele, Rozza. A local approach to parameter space reduction for regression and classification tasks. 2021. [arXiv].
- Meneghetti, Demo, Rozza. A Dimensionality Reduction Approach for Convolutional Neural Networks. 2021. [arXiv].
- Demo, Tezzele, Mola, Rozza. Hull shape design optimization with parameter space and model reductions and self-learning mesh morphing. 2021. [DOI], [arXiv].
- Romor, Tezzele, Rozza. Multi-fidelity data fusion for the approximation of scalar functions with low intrinsic dimensionality through active subspaces. 2020. [DOI], [arXiv].
- Romor, Tezzele, Lario, Rozza. Kernel-based Active Subspaces with application to CFD parametric problems using Discontinuous Galerkin method. 2022. [DOI].
- Demo, Tezzele, Rozza. A supervised learning approach involving active subspaces for an efficient genetic algorithm in high-dimensional optimization problems. 2020. [DOI], [arXiv].
- Tezzele, Demo, Stabile, Mola, Rozza. Enhancing CFD predictions in shape design problems by model and parameter space reduction. 2020. [DOI], [arXiv].
ATHENA is currently developed and maintained at SISSA mathLab by Francesco Romor and Marco Tezzele, under the supervision of Prof. Gianluigi Rozza.
Contact us by email for further information or questions about ATHENA, or to suggest pull requests. Contributions improving either the code or the documentation are welcome!
We'd love to accept your patches and contributions to this project. There are just a few small guidelines you need to follow.
- It's generally best to start by opening a new issue describing the bug or feature you're intending to fix. Even if you think it's relatively minor, it's helpful to know what people are working on. Mention in the initial issue that you are planning to work on that bug or feature so that it can be assigned to you.
- Follow the normal process of forking the project, and set up a new branch to work in. It's important that each group of changes be done in separate branches in order to ensure that a pull request only includes the commits related to that bug or feature (see the command outline after this list).
- To ensure properly formatted code, please make sure to use 4 spaces to indent the code. The easiest way is to run the provided script from your shell: ./code_formatter.sh. You should also run pylint over your code. It's not strictly necessary that your code be completely "lint-free", but this will help you find common style issues.
- Any significant changes should almost always be accompanied by tests. The project already has good test coverage, so look at some of the existing tests if you're unsure how to go about it. We're using Coveralls, an invaluable tool for seeing which parts of your code aren't being exercised by your tests.
- Do your best to have well-formed commit messages for each change. This provides consistency throughout the project, and ensures that commit messages are able to be formatted properly by various git tools.
- Finally, push the commits to your fork and submit a pull request. Please remember to rebase properly in order to maintain a clean, linear git history.
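As a rough outline of the fork-and-branch workflow described above (the fork URL and branch name below are placeholders):
> git clone https://github.com/<your-username>/ATHENA
> cd ATHENA
> git checkout -b my-new-feature
After editing, format and test before committing:
> ./code_formatter.sh
> pytest
> git commit -am "Short, well-formed description of the change"
> git push origin my-new-feature
Then open a pull request against mathLab/ATHENA from your fork.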
See the LICENSE file for license rights and limitations (MIT).