diff --git a/lectures/greek_square.md b/lectures/greek_square.md
index db14f12b..d7c13887 100644
--- a/lectures/greek_square.md
+++ b/lectures/greek_square.md
@@ -217,10 +217,10 @@ $$
 where $\eta_1$ and $\eta_2$ are chosen to satisfy the prescribed initial conditions $y_{-1}, y_{-2}$:
 
 $$
-\begin{align}
+\begin{aligned}
 \lambda_1^{-1} \eta_1 + \lambda_2^{-1} \eta_2 & = y_{-1} \cr
 \lambda_1^{-2} \eta_1 + \lambda_2^{-2} \eta_2 & = y_{-2}
-\end{align}
+\end{aligned}
 $$(eq:leq_sq)
 
 System {eq}`eq:leq_sq` of simultaneous linear equations will play a big role in the remainder of this lecture.
diff --git a/lectures/markov_chains_II.md b/lectures/markov_chains_II.md
index 67def2d7..8124e0c1 100644
--- a/lectures/markov_chains_II.md
+++ b/lectures/markov_chains_II.md
@@ -4,7 +4,7 @@ jupytext:
     extension: .md
     format_name: myst
     format_version: 0.13
-    jupytext_version: 1.14.4
+    jupytext_version: 1.16.1
 kernelspec:
   display_name: Python 3 (ipykernel)
   language: python
@@ -248,8 +248,6 @@ Hence we expect that $\hat p_n(x) \approx \psi^*(x)$ when $n$ is large.
 
 The next figure shows convergence of $\hat p_n(x)$ to $\psi^*(x)$ when $x=1$ and $X_0$ is either $0, 1$ or $2$.
 
-
-
 ```{code-cell} ipython3
 P = np.array([[0.971, 0.029, 0.000],
               [0.145, 0.778, 0.077],
@@ -330,6 +328,8 @@ for i in range(n):
 
     axes[i].plot(p_hat, label=f'$x_0 = \, {x0} $')
     axes[i].legend()
+
+plt.tight_layout()
 plt.show()
 ```
 
@@ -395,8 +395,6 @@ ax.legend()
 plt.show()
 ```
 
-
-
 ### Expectations of geometric sums
 
 Sometimes we want to compute the mathematical expectation of a geometric sum, such as