
Built site for gh-pages
Quarto GHA Workflow Runner committed Oct 26, 2023
1 parent b0d8cbc commit 9a6b6ab
Showing 7 changed files with 66 additions and 72 deletions.
2 changes: 1 addition & 1 deletion .nojekyll
@@ -1 +1 @@
-38a419c5
+50e3d627
22 changes: 8 additions & 14 deletions schedule/index.html
@@ -512,20 +512,14 @@ <h2 class="anchored" data-anchor-id="unsupervised-learning">5 Unsupervised learn
</section>
<section id="f-final-exam" class="level2">
<h2 class="anchored" data-anchor-id="f-final-exam">F Final exam</h2>
-<p>Date and time TBD.</p>
-<div class="callout callout-style-default callout-important callout-titled">
-<div class="callout-header d-flex align-content-center">
-<div class="callout-icon-container">
-<i class="callout-icon"></i>
-</div>
-<div class="callout-title-container flex-fill">
-Important
-</div>
-</div>
-<div class="callout-body-container callout-body">
-<p>Do not make any plans to leave Vancouver before the final exam date is announced.</p>
-</div>
-</div>
+<p><strong>Monday, December 18 at 12-2pm, location TBA</strong></p>
+<!--
+::: {.callout-important}
+Do not make any plans to leave Vancouver before the final exam date is announced.
+:::
+-->
<ul>
<li>In person attendance is required (per Faculty of Science guidelines)</li>
<li>You must bring your computer as the exam will be given through Canvas</li>
8 changes: 4 additions & 4 deletions schedule/slides/00-gradient-descent.html
@@ -397,7 +397,7 @@
<h2>00 Gradient descent</h2>
<p><span class="secondary">Stat 406</span></p>
<p><span class="secondary">Daniel J. McDonald</span></p>
-<p>Last modified – 16 October 2023</p>
+<p>Last modified – 25 October 2023</p>
<p><span class="math display">\[
\DeclareMathOperator*{\argmin}{argmin}
\DeclareMathOperator*{\argmax}{argmax}
@@ -471,7 +471,7 @@ <h2>Very basic example</h2>
</section>
<section id="why-does-this-work" class="slide level2">
<h2>Why does this work?</h2>
-<p><span class="secendary">Heuristic interpretation:</span></p>
+<p><span class="secondary">Heuristic interpretation:</span></p>
<ul>
<li><p>Gradient tells me the slope.</p></li>
<li><p>negative gradient points toward the minimum</p></li>
@@ -508,11 +508,11 @@ <h2>What <span class="math inline">\(\gamma\)</span>? (more details than we have
<li>Usually does not work</li>
</ul>
<p><span class="secondary">Decay on a schedule</span></p>
-<p><span class="math inline">\(\gamma_{k+1} = \frac{\gamma_k}{1+ck}\)</span> or <span class="math inline">\(\gamma_{k} = \gamma_0 b^k\)</span></p>
+<p><span class="math inline">\(\gamma_{n+1} = \frac{\gamma_n}{1+cn}\)</span> or <span class="math inline">\(\gamma_{n} = \gamma_0 b^n\)</span></p>
<p><span class="secondary">Exact line search</span></p>
<ul>
<li>Tells you exactly how far to go.</li>
-<li>At each <span class="math inline">\(k\)</span>, solve <span class="math inline">\(\gamma_k = \arg\min_{s \geq 0} f( x^{(k)} - s f(x^{(k-1)}))\)</span></li>
+<li>At each iteration <span class="math inline">\(n\)</span>, solve <span class="math inline">\(\gamma_n = \arg\min_{s \geq 0} f( x^{(n)} - s f(x^{(n-1)}))\)</span></li>
<li>Usually can’t solve this.</li>
</ul>
</section>
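To make the two decay schedules above concrete, here is a minimal sketch (not part of this commit; the objective f(x) = x^2 and the values of gamma0, c, b, and n_iter are made-up for illustration) of gradient descent under gamma_{n+1} = gamma_n / (1 + c n) and gamma_n = gamma_0 * b^n:

```python
# Hypothetical illustration: gradient descent on f(x) = x^2, whose minimizer is x = 0.
# gamma0, c, b, and n_iter are arbitrary choices, not values from the slides.

def grad(x):
    return 2.0 * x  # f(x) = x^2, so f'(x) = 2x


def gd_recursive_decay(x, gamma0=0.1, c=0.1, n_iter=100):
    gamma = gamma0
    for n in range(n_iter):
        x -= gamma * grad(x)
        gamma = gamma / (1 + c * n)  # gamma_{n+1} = gamma_n / (1 + c * n)
    return x


def gd_exponential_decay(x, gamma0=0.1, b=0.9, n_iter=100):
    for n in range(n_iter):
        x -= (gamma0 * b**n) * grad(x)  # gamma_n = gamma_0 * b^n
    return x


print(gd_recursive_decay(10.0))    # shrinks toward 0 but stalls once gamma_n decays too fast
print(gd_exponential_decay(10.0))  # same caveat: here the step sizes sum to a finite value
```

Both runs illustrate the tuning caveat behind the slide: if the step sizes shrink too quickly, their sum stays finite and the iterates stall before reaching the minimizer, which is why c and b must be chosen with care.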
4 changes: 2 additions & 2 deletions schedule/slides/16-logistic-regression.html
@@ -397,7 +397,7 @@
<h2>16 Logistic regression</h2>
<p><span class="secondary">Stat 406</span></p>
<p><span class="secondary">Daniel J. McDonald</span></p>
-<p>Last modified – 09 October 2023</p>
+<p>Last modified – 25 October 2023</p>
<p><span class="math display">\[
\DeclareMathOperator*{\argmin}{argmin}
\DeclareMathOperator*{\argmax}{argmax}
@@ -446,7 +446,7 @@ <h2>Direct model</h2>
<p><span class="math display">\[
\begin{aligned}
Pr(Y = 1 \given X=x) &amp; = \frac{\exp\{\beta_0 + \beta^{\top}x\}}{1 + \exp\{\beta_0 + \beta^{\top}x\}} \\
-\P(Y = 0 | X=x) &amp; = \frac{1}{1 + \exp\{\beta_0 + \beta^{\top}x\}}=1-\frac{\exp\{\beta_0 + \beta^{\top}x\}}{1 + \exp\{\beta_0 + \beta^{\top}x\}}
+Pr(Y = 0 | X=x) &amp; = \frac{1}{1 + \exp\{\beta_0 + \beta^{\top}x\}}=1-\frac{\exp\{\beta_0 + \beta^{\top}x\}}{1 + \exp\{\beta_0 + \beta^{\top}x\}}
\end{aligned}
\]</span></p>
<p>This is logistic regression.</p>
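As a quick check of the two expressions above, here is a minimal sketch (not part of this commit; the coefficients and input are made-up) that evaluates both posterior probabilities and confirms they sum to one:

```python
import numpy as np

# Hypothetical coefficients and input, for illustration only.
beta0 = 0.5
beta = np.array([1.0, 0.3])
x = np.array([1.0, -2.0])

eta = beta0 + beta @ x                # linear predictor beta_0 + beta^T x = 0.9
p1 = np.exp(eta) / (1 + np.exp(eta))  # Pr(Y = 1 | X = x), about 0.711
p0 = 1 / (1 + np.exp(eta))            # Pr(Y = 0 | X = x) = 1 - p1, about 0.289
print(p1, p0, p1 + p0)                # the two probabilities sum to 1
```

For large |eta| the naive exp can overflow; a numerically stable sigmoid such as scipy.special.expit is the usual choice in practice.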
6 changes: 3 additions & 3 deletions schedule/slides/17-nonlinear-classifiers.html
@@ -397,7 +397,7 @@
<h2>17 Nonlinear classifiers</h2>
<p><span class="secondary">Stat 406</span></p>
<p><span class="secondary">Daniel J. McDonald</span></p>
-<p>Last modified – 09 October 2023</p>
+<p>Last modified – 26 October 2023</p>
<p><span class="math display">\[
\DeclareMathOperator*{\argmin}{argmin}
\DeclareMathOperator*{\argmax}{argmax}
@@ -429,8 +429,8 @@ <h2>17 Nonlinear classifiers</h2>
<h2>Last time</h2>
<p>We reviewed logistic regression</p>
<p><span class="math display">\[\begin{aligned}
-\P(Y = 1 \given X=x) &amp; = \frac{\exp\{\beta_0 + \beta^{\top}x\}}{1 + \exp\{\beta_0 + \beta^{\top}x\}} \\
-\P(Y = 0 \given X=x) &amp; = \frac{1}{1 + \exp\{\beta_0 + \beta^{\top}x\}}=1-\frac{\exp\{\beta_0 + \beta^{\top}x\}}{1 + \exp\{\beta_0 + \beta^{\top}x\}}\end{aligned}\]</span></p>
+Pr(Y = 1 \given X=x) &amp; = \frac{\exp\{\beta_0 + \beta^{\top}x\}}{1 + \exp\{\beta_0 + \beta^{\top}x\}} \\
+Pr(Y = 0 \given X=x) &amp; = \frac{1}{1 + \exp\{\beta_0 + \beta^{\top}x\}}=1-\frac{\exp\{\beta_0 + \beta^{\top}x\}}{1 + \exp\{\beta_0 + \beta^{\top}x\}}\end{aligned}\]</span></p>
</section>
<section id="make-it-nonlinear" class="slide level2">
<h2>Make it nonlinear</h2>
14 changes: 7 additions & 7 deletions search.json
@@ -340,7 +340,7 @@
"href": "schedule/slides/16-logistic-regression.html#meta-lecture",
"title": "UBC Stat406 2023W",
"section": "16 Logistic regression",
-"text": "16 Logistic regression\nStat 406\nDaniel J. McDonald\nLast modified – 09 October 2023\n\\[\n\\DeclareMathOperator*{\\argmin}{argmin}\n\\DeclareMathOperator*{\\argmax}{argmax}\n\\DeclareMathOperator*{\\minimize}{minimize}\n\\DeclareMathOperator*{\\maximize}{maximize}\n\\DeclareMathOperator*{\\find}{find}\n\\DeclareMathOperator{\\st}{subject\\,\\,to}\n\\newcommand{\\E}{E}\n\\newcommand{\\Expect}[1]{\\E\\left[ #1 \\right]}\n\\newcommand{\\Var}[1]{\\mathrm{Var}\\left[ #1 \\right]}\n\\newcommand{\\Cov}[2]{\\mathrm{Cov}\\left[#1,\\ #2\\right]}\n\\newcommand{\\given}{\\ \\vert\\ }\n\\newcommand{\\X}{\\mathbf{X}}\n\\newcommand{\\x}{\\mathbf{x}}\n\\newcommand{\\y}{\\mathbf{y}}\n\\newcommand{\\P}{\\mathcal{P}}\n\\newcommand{\\R}{\\mathbb{R}}\n\\newcommand{\\norm}[1]{\\left\\lVert #1 \\right\\rVert}\n\\newcommand{\\snorm}[1]{\\lVert #1 \\rVert}\n\\newcommand{\\tr}[1]{\\mbox{tr}(#1)}\n\\newcommand{\\brt}{\\widehat{\\beta}^R_{s}}\n\\newcommand{\\brl}{\\widehat{\\beta}^R_{\\lambda}}\n\\newcommand{\\bls}{\\widehat{\\beta}_{ols}}\n\\newcommand{\\blt}{\\widehat{\\beta}^L_{s}}\n\\newcommand{\\bll}{\\widehat{\\beta}^L_{\\lambda}}\n\\]"
+"text": "16 Logistic regression\nStat 406\nDaniel J. McDonald\nLast modified – 25 October 2023\n\\[\n\\DeclareMathOperator*{\\argmin}{argmin}\n\\DeclareMathOperator*{\\argmax}{argmax}\n\\DeclareMathOperator*{\\minimize}{minimize}\n\\DeclareMathOperator*{\\maximize}{maximize}\n\\DeclareMathOperator*{\\find}{find}\n\\DeclareMathOperator{\\st}{subject\\,\\,to}\n\\newcommand{\\E}{E}\n\\newcommand{\\Expect}[1]{\\E\\left[ #1 \\right]}\n\\newcommand{\\Var}[1]{\\mathrm{Var}\\left[ #1 \\right]}\n\\newcommand{\\Cov}[2]{\\mathrm{Cov}\\left[#1,\\ #2\\right]}\n\\newcommand{\\given}{\\ \\vert\\ }\n\\newcommand{\\X}{\\mathbf{X}}\n\\newcommand{\\x}{\\mathbf{x}}\n\\newcommand{\\y}{\\mathbf{y}}\n\\newcommand{\\P}{\\mathcal{P}}\n\\newcommand{\\R}{\\mathbb{R}}\n\\newcommand{\\norm}[1]{\\left\\lVert #1 \\right\\rVert}\n\\newcommand{\\snorm}[1]{\\lVert #1 \\rVert}\n\\newcommand{\\tr}[1]{\\mbox{tr}(#1)}\n\\newcommand{\\brt}{\\widehat{\\beta}^R_{s}}\n\\newcommand{\\brl}{\\widehat{\\beta}^R_{\\lambda}}\n\\newcommand{\\bls}{\\widehat{\\beta}_{ols}}\n\\newcommand{\\blt}{\\widehat{\\beta}^L_{s}}\n\\newcommand{\\bll}{\\widehat{\\beta}^L_{\\lambda}}\n\\]"
},
{
"objectID": "schedule/slides/16-logistic-regression.html#last-time",
@@ -354,7 +354,7 @@
"href": "schedule/slides/16-logistic-regression.html#direct-model",
"title": "UBC Stat406 2023W",
"section": "Direct model",
-"text": "Direct model\nInstead, let’s directly model the posterior\n\\[\n\\begin{aligned}\nPr(Y = 1 \\given X=x) & = \\frac{\\exp\\{\\beta_0 + \\beta^{\\top}x\\}}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}} \\\\\n\\P(Y = 0 | X=x) & = \\frac{1}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}}=1-\\frac{\\exp\\{\\beta_0 + \\beta^{\\top}x\\}}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}}\n\\end{aligned}\n\\]\nThis is logistic regression."
+"text": "Direct model\nInstead, let’s directly model the posterior\n\\[\n\\begin{aligned}\nPr(Y = 1 \\given X=x) & = \\frac{\\exp\\{\\beta_0 + \\beta^{\\top}x\\}}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}} \\\\\nPr(Y = 0 | X=x) & = \\frac{1}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}}=1-\\frac{\\exp\\{\\beta_0 + \\beta^{\\top}x\\}}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}}\n\\end{aligned}\n\\]\nThis is logistic regression."
},
{
"objectID": "schedule/slides/16-logistic-regression.html#why-this",
@@ -1432,7 +1432,7 @@
"href": "schedule/slides/00-gradient-descent.html#meta-lecture",
"title": "UBC Stat406 2023W",
"section": "00 Gradient descent",
-"text": "00 Gradient descent\nStat 406\nDaniel J. McDonald\nLast modified – 16 October 2023\n\\[\n\\DeclareMathOperator*{\\argmin}{argmin}\n\\DeclareMathOperator*{\\argmax}{argmax}\n\\DeclareMathOperator*{\\minimize}{minimize}\n\\DeclareMathOperator*{\\maximize}{maximize}\n\\DeclareMathOperator*{\\find}{find}\n\\DeclareMathOperator{\\st}{subject\\,\\,to}\n\\newcommand{\\E}{E}\n\\newcommand{\\Expect}[1]{\\E\\left[ #1 \\right]}\n\\newcommand{\\Var}[1]{\\mathrm{Var}\\left[ #1 \\right]}\n\\newcommand{\\Cov}[2]{\\mathrm{Cov}\\left[#1,\\ #2\\right]}\n\\newcommand{\\given}{\\ \\vert\\ }\n\\newcommand{\\X}{\\mathbf{X}}\n\\newcommand{\\x}{\\mathbf{x}}\n\\newcommand{\\y}{\\mathbf{y}}\n\\newcommand{\\P}{\\mathcal{P}}\n\\newcommand{\\R}{\\mathbb{R}}\n\\newcommand{\\norm}[1]{\\left\\lVert #1 \\right\\rVert}\n\\newcommand{\\snorm}[1]{\\lVert #1 \\rVert}\n\\newcommand{\\tr}[1]{\\mbox{tr}(#1)}\n\\newcommand{\\brt}{\\widehat{\\beta}^R_{s}}\n\\newcommand{\\brl}{\\widehat{\\beta}^R_{\\lambda}}\n\\newcommand{\\bls}{\\widehat{\\beta}_{ols}}\n\\newcommand{\\blt}{\\widehat{\\beta}^L_{s}}\n\\newcommand{\\bll}{\\widehat{\\beta}^L_{\\lambda}}\n\\]"
+"text": "00 Gradient descent\nStat 406\nDaniel J. McDonald\nLast modified – 25 October 2023\n\\[\n\\DeclareMathOperator*{\\argmin}{argmin}\n\\DeclareMathOperator*{\\argmax}{argmax}\n\\DeclareMathOperator*{\\minimize}{minimize}\n\\DeclareMathOperator*{\\maximize}{maximize}\n\\DeclareMathOperator*{\\find}{find}\n\\DeclareMathOperator{\\st}{subject\\,\\,to}\n\\newcommand{\\E}{E}\n\\newcommand{\\Expect}[1]{\\E\\left[ #1 \\right]}\n\\newcommand{\\Var}[1]{\\mathrm{Var}\\left[ #1 \\right]}\n\\newcommand{\\Cov}[2]{\\mathrm{Cov}\\left[#1,\\ #2\\right]}\n\\newcommand{\\given}{\\ \\vert\\ }\n\\newcommand{\\X}{\\mathbf{X}}\n\\newcommand{\\x}{\\mathbf{x}}\n\\newcommand{\\y}{\\mathbf{y}}\n\\newcommand{\\P}{\\mathcal{P}}\n\\newcommand{\\R}{\\mathbb{R}}\n\\newcommand{\\norm}[1]{\\left\\lVert #1 \\right\\rVert}\n\\newcommand{\\snorm}[1]{\\lVert #1 \\rVert}\n\\newcommand{\\tr}[1]{\\mbox{tr}(#1)}\n\\newcommand{\\brt}{\\widehat{\\beta}^R_{s}}\n\\newcommand{\\brl}{\\widehat{\\beta}^R_{\\lambda}}\n\\newcommand{\\bls}{\\widehat{\\beta}_{ols}}\n\\newcommand{\\blt}{\\widehat{\\beta}^L_{s}}\n\\newcommand{\\bll}{\\widehat{\\beta}^L_{\\lambda}}\n\\]"
},
{
"objectID": "schedule/slides/00-gradient-descent.html#simple-optimization-techniques",
@@ -1481,7 +1481,7 @@
"href": "schedule/slides/00-gradient-descent.html#what-gamma-more-details-than-we-have-time-for",
"title": "UBC Stat406 2023W",
"section": "What \\(\\gamma\\)? (more details than we have time for)",
-"text": "What \\(\\gamma\\)? (more details than we have time for)\nWhat to use for \\(\\gamma_k\\)?\nFixed\n\nOnly works if \\(\\gamma\\) is exactly right\nUsually does not work\n\nDecay on a schedule\n\\(\\gamma_{k+1} = \\frac{\\gamma_k}{1+ck}\\) or \\(\\gamma_{k} = \\gamma_0 b^k\\)\nExact line search\n\nTells you exactly how far to go.\nAt each \\(k\\), solve \\(\\gamma_k = \\arg\\min_{s \\geq 0} f( x^{(k)} - s f(x^{(k-1)}))\\)\nUsually can’t solve this."
+"text": "What \\(\\gamma\\)? (more details than we have time for)\nWhat to use for \\(\\gamma_k\\)?\nFixed\n\nOnly works if \\(\\gamma\\) is exactly right\nUsually does not work\n\nDecay on a schedule\n\\(\\gamma_{n+1} = \\frac{\\gamma_n}{1+cn}\\) or \\(\\gamma_{n} = \\gamma_0 b^n\\)\nExact line search\n\nTells you exactly how far to go.\nAt each iteration \\(n\\), solve \\(\\gamma_n = \\arg\\min_{s \\geq 0} f( x^{(n)} - s f(x^{(n-1)}))\\)\nUsually can’t solve this."
},
{
"objectID": "schedule/slides/00-gradient-descent.html#section",
@@ -2160,7 +2160,7 @@
"href": "schedule/index.html#f-final-exam",
"title": " Schedule",
"section": "F Final exam",
-"text": "F Final exam\nDate and time TBD.\n\n\n\n\n\n\nImportant\n\n\n\nDo not make any plans to leave Vancouver before the final exam date is announced.\n\n\n\nIn person attendance is required (per Faculty of Science guidelines)\nYou must bring your computer as the exam will be given through Canvas\nPlease arrange to borrow one from the library if you do not have your own. Let me know ASAP if this may pose a problem.\nYou may bring 2 sheets of front/back 8.5x11 paper with any notes you want to use. No other materials will be allowed.\nThere will be no required coding, but I may show code or output and ask questions about it.\nIt will be entirely multiple choice / True-False / matching, etc. Delivered on Canvas."
+"text": "F Final exam\nMonday, December 18 at 12-2pm, location TBA\n\n\nIn person attendance is required (per Faculty of Science guidelines)\nYou must bring your computer as the exam will be given through Canvas\nPlease arrange to borrow one from the library if you do not have your own. Let me know ASAP if this may pose a problem.\nYou may bring 2 sheets of front/back 8.5x11 paper with any notes you want to use. No other materials will be allowed.\nThere will be no required coding, but I may show code or output and ask questions about it.\nIt will be entirely multiple choice / True-False / matching, etc. Delivered on Canvas."
},
{
"objectID": "schedule/slides/00-cv-for-many-models.html#meta-lecture",
@@ -3546,14 +3546,14 @@
"href": "schedule/slides/17-nonlinear-classifiers.html#meta-lecture",
"title": "UBC Stat406 2023W",
"section": "17 Nonlinear classifiers",
-"text": "17 Nonlinear classifiers\nStat 406\nDaniel J. McDonald\nLast modified – 09 October 2023\n\\[\n\\DeclareMathOperator*{\\argmin}{argmin}\n\\DeclareMathOperator*{\\argmax}{argmax}\n\\DeclareMathOperator*{\\minimize}{minimize}\n\\DeclareMathOperator*{\\maximize}{maximize}\n\\DeclareMathOperator*{\\find}{find}\n\\DeclareMathOperator{\\st}{subject\\,\\,to}\n\\newcommand{\\E}{E}\n\\newcommand{\\Expect}[1]{\\E\\left[ #1 \\right]}\n\\newcommand{\\Var}[1]{\\mathrm{Var}\\left[ #1 \\right]}\n\\newcommand{\\Cov}[2]{\\mathrm{Cov}\\left[#1,\\ #2\\right]}\n\\newcommand{\\given}{\\ \\vert\\ }\n\\newcommand{\\X}{\\mathbf{X}}\n\\newcommand{\\x}{\\mathbf{x}}\n\\newcommand{\\y}{\\mathbf{y}}\n\\newcommand{\\P}{\\mathcal{P}}\n\\newcommand{\\R}{\\mathbb{R}}\n\\newcommand{\\norm}[1]{\\left\\lVert #1 \\right\\rVert}\n\\newcommand{\\snorm}[1]{\\lVert #1 \\rVert}\n\\newcommand{\\tr}[1]{\\mbox{tr}(#1)}\n\\newcommand{\\brt}{\\widehat{\\beta}^R_{s}}\n\\newcommand{\\brl}{\\widehat{\\beta}^R_{\\lambda}}\n\\newcommand{\\bls}{\\widehat{\\beta}_{ols}}\n\\newcommand{\\blt}{\\widehat{\\beta}^L_{s}}\n\\newcommand{\\bll}{\\widehat{\\beta}^L_{\\lambda}}\n\\]"
+"text": "17 Nonlinear classifiers\nStat 406\nDaniel J. McDonald\nLast modified – 26 October 2023\n\\[\n\\DeclareMathOperator*{\\argmin}{argmin}\n\\DeclareMathOperator*{\\argmax}{argmax}\n\\DeclareMathOperator*{\\minimize}{minimize}\n\\DeclareMathOperator*{\\maximize}{maximize}\n\\DeclareMathOperator*{\\find}{find}\n\\DeclareMathOperator{\\st}{subject\\,\\,to}\n\\newcommand{\\E}{E}\n\\newcommand{\\Expect}[1]{\\E\\left[ #1 \\right]}\n\\newcommand{\\Var}[1]{\\mathrm{Var}\\left[ #1 \\right]}\n\\newcommand{\\Cov}[2]{\\mathrm{Cov}\\left[#1,\\ #2\\right]}\n\\newcommand{\\given}{\\ \\vert\\ }\n\\newcommand{\\X}{\\mathbf{X}}\n\\newcommand{\\x}{\\mathbf{x}}\n\\newcommand{\\y}{\\mathbf{y}}\n\\newcommand{\\P}{\\mathcal{P}}\n\\newcommand{\\R}{\\mathbb{R}}\n\\newcommand{\\norm}[1]{\\left\\lVert #1 \\right\\rVert}\n\\newcommand{\\snorm}[1]{\\lVert #1 \\rVert}\n\\newcommand{\\tr}[1]{\\mbox{tr}(#1)}\n\\newcommand{\\brt}{\\widehat{\\beta}^R_{s}}\n\\newcommand{\\brl}{\\widehat{\\beta}^R_{\\lambda}}\n\\newcommand{\\bls}{\\widehat{\\beta}_{ols}}\n\\newcommand{\\blt}{\\widehat{\\beta}^L_{s}}\n\\newcommand{\\bll}{\\widehat{\\beta}^L_{\\lambda}}\n\\]"
},
{
"objectID": "schedule/slides/17-nonlinear-classifiers.html#last-time",
"href": "schedule/slides/17-nonlinear-classifiers.html#last-time",
"title": "UBC Stat406 2023W",
"section": "Last time",
-"text": "Last time\nWe reviewed logistic regression\n\\[\\begin{aligned}\n\\P(Y = 1 \\given X=x) & = \\frac{\\exp\\{\\beta_0 + \\beta^{\\top}x\\}}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}} \\\\\n\\P(Y = 0 \\given X=x) & = \\frac{1}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}}=1-\\frac{\\exp\\{\\beta_0 + \\beta^{\\top}x\\}}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}}\\end{aligned}\\]"
+"text": "Last time\nWe reviewed logistic regression\n\\[\\begin{aligned}\nPr(Y = 1 \\given X=x) & = \\frac{\\exp\\{\\beta_0 + \\beta^{\\top}x\\}}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}} \\\\\nPr(Y = 0 \\given X=x) & = \\frac{1}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}}=1-\\frac{\\exp\\{\\beta_0 + \\beta^{\\top}x\\}}{1 + \\exp\\{\\beta_0 + \\beta^{\\top}x\\}}\\end{aligned}\\]"
},
{
"objectID": "schedule/slides/17-nonlinear-classifiers.html#make-it-nonlinear",
