From 7bf9a19094e663145047955b9a7e52f71166de74 Mon Sep 17 00:00:00 2001
From: Will Dean <57733339+wd60622@users.noreply.github.com>
Date: Mon, 30 Sep 2024 08:27:52 -0400
Subject: [PATCH] Add new example for Unsupported Distribution (#145)

* add example back in

* reorganize example
---
 docs/examples/unsupported-distributions.md | 55 ++++++++++++++--------
 mkdocs.yml                                 |  2 +-
 2 files changed, 37 insertions(+), 20 deletions(-)

diff --git a/docs/examples/unsupported-distributions.md b/docs/examples/unsupported-distributions.md
index 6b6c9ac..be6bfd5 100644
--- a/docs/examples/unsupported-distributions.md
+++ b/docs/examples/unsupported-distributions.md
@@ -3,26 +3,44 @@ comments: true
 ---
 
 # Unsupported Posterior Predictive Distributions
-
-Suppose we want to use the geometric model with a beta prior which doesn't have a
+Suppose we want to use the Pareto model with a Gamma prior, which doesn't have a
 supported distribution for the posterior predictive.
 
-```python
-from conjugate.distributions import Beta
-from conjugate.models import geometric_beta
-
-prior = Beta(1, 1)
-posterior: Beta = geometric_beta(x_total=12, n=10, prior=prior)
-```
-
 We can get posterior predictive samples by:
 
 1. Sample from the posterior distribution
 2. Sample from the model distribution using posterior samples
 
-## 1. Using `conjugate-models`
+## Setup
+
+```python
+import numpy as np
+
+from conjugate.distributions import Gamma, Pareto
+from conjugate.models import pareto_gamma
+
+seed = sum(map(ord, "Unsupported Posterior Predictive Distributions"))
+rng = np.random.default_rng(seed)
 
-This is easy to do with this package.
+n = 10
+
+x_m = 1
+alpha = 2.5
+true_distribution = Pareto(x_m=x_m, alpha=alpha)
+
+data = true_distribution.dist.rvs(size=n, random_state=rng)
+
+prior = Gamma(1, 1)
+posterior: Gamma = pareto_gamma(
+    n=n,
+    ln_x_total=np.log(data).sum(),
+    x_m=x_m,
+    prior=prior,
+)
+```
+
+
+## 1. Using `conjugate-models`
 
 Since the distributions are vectorized, just:
 
@@ -30,22 +48,21 @@ Since the distributions are vectorized, just:
 2. Take a single sample from the model distribution
 
 ```python
-from conjugate.distributions import Geometric
-
 n_samples = 1_000
-posterior_samples = posterior.dist.rvs(size=n_samples)
-posterior_predictive_samples = Geometric(p=posterior_samples).dist.rvs()
+
+alpha_samples = posterior.dist.rvs(size=n_samples, random_state=rng)
+posterior_predictive_samples = Pareto(x_m=x_m, alpha=alpha_samples).dist.rvs(random_state=rng)
 ```
 
-## 2. Using `pymc`
+## 2. Using PyMC
 
 Another route would be using PyMC then use the `draw` function.
 
 ```python
 import pymc as pm
 
-posterior_dist = pm.Beta.dist(alpha=posterior.alpha, beta=posterior.beta)
-geometric_posterior_predictive = pm.Geometric.dist(posterior_dist)
+posterior_alpha = pm.Gamma.dist(alpha=posterior.alpha, beta=posterior.beta)
+geometric_posterior_predictive = pm.Pareto.dist(m=x_m, alpha=posterior_alpha)
 
 n_samples = 1_000
 posterior_predictive_samples = pm.draw(geometric_posterior_predictive, draws=n_samples)
diff --git a/mkdocs.yml b/mkdocs.yml
index 35d33f9..b2ee2ef 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -49,7 +49,7 @@ nav:
     - Bayesian Update: examples/bayesian-update.md
     - Thompson Sampling: examples/thompson.md
     - Linear Regression: examples/linear-regression.md
-    # - Unsupported Distributions: examples/unsupported-distributions.md
+    - Unsupported Distributions: examples/unsupported-distributions.md
     - Inference in SQL: examples/sql.md
     - Bootstrap Comparison: examples/bootstrap.md
   - Features: