Pytorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
Implementation of the MNIST experiment for Monte Carlo Dropout from http://mlg.eng.cam.ac.uk/yarin/PDFs/NIPS_2015_bayesian_convnets.pdf
This repository reimplements MC Dropout using the TensorFlow 2.0 eager execution extension.
An experimental Python package for learning Bayesian neural networks.
(Forked Version) Experiments used in "Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning"
We provide two notebooks that let users explore and experiment with several BDL techniques, such as Ensembles, MC Dropout, and the Laplace Approximation. They allow you to intuitively visualize the main differences among these techniques on a simulated dataset and the Boston housing dataset.
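The MC Dropout technique mentioned above amounts to keeping dropout active at test time and averaging several stochastic forward passes. A minimal PyTorch sketch (the network architecture and function names here are illustrative assumptions, not code from any of the listed repositories):

```python
import torch
import torch.nn as nn

class MCDropoutNet(nn.Module):
    """A small MLP with dropout layers; assumed architecture for illustration."""
    def __init__(self, in_dim=1, hidden=64, out_dim=1, p=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def mc_predict(model, x, n_samples=50):
    # Keep dropout active at prediction time by staying in train mode,
    # then average n_samples stochastic forward passes.
    model.train()
    preds = torch.stack([model(x) for _ in range(n_samples)])
    # Predictive mean and a per-output uncertainty estimate (sample std).
    return preds.mean(dim=0), preds.std(dim=0)
```

The key design choice is calling `model.train()` inside `mc_predict`: in eval mode `nn.Dropout` becomes the identity, so the stochastic passes that give MC Dropout its uncertainty estimate would all be identical.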
An approach to manufacturing optimization focused on the textile forming process. This research combines domain-specific knowledge with simulation modeling and uses Bayesian optimization for efficient exploration of the parameter space.
Data drift analysis and anomaly detection tools
Master thesis for the MSc. Artificial Intelligence at the Universiteit van Amsterdam, 2019
Numerical solution and uncertainty quantification of Pennes' bioheat transfer equation in 1-D using a deep neural network solver.