- Predict categories; coefficients are fit by Maximum Likelihood Estimation (MLE)
- Probability : Something Happening / Everything that could Happen
- Odds : Something Happening / Something Not Happening
- Log(Odds) : Makes the odds scale symmetric around 0
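A minimal sketch of the three definitions above, using assumed probabilities for illustration. Note how odds of 0.75 and 0.25 are asymmetric around 1 (3 vs 1/3), while their log-odds are symmetric around 0:

```python
import math

def odds(p):
    # odds = something happening / something not happening
    return p / (1 - p)

# Probability 0.75 and its complement 0.25:
print(odds(0.75))            # 3.0   -> asymmetric around 1
print(odds(0.25))            # 0.333...
print(math.log(odds(0.75)))  # +1.0986 -> symmetric around 0
print(math.log(odds(0.25)))  # -1.0986
```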
- Indepth Logistic Output explained
- Assumption 1 - Appropriate outcome type (Must be categorical)
- Assumption 2 - Linearity of independent variables and log odds
- Assumption 3 - No strongly influential outliers
- Assumption 4 - Absence of multicollinearity
- Assumption 5 - Independence of observations
- Assumption 6 - Sufficiently large sample size
- Python Code for Logistic Regression Assumptions
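One sketch of checking Assumption 4 (absence of multicollinearity) via the variance inflation factor, implemented directly in NumPy on synthetic data (the data and the VIF > 5-10 rule of thumb are illustrative assumptions, not from the source):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X.
    VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing column j
    on all the other columns. VIF above ~5-10 usually flags a problem."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])  # intercept + other columns
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.1, size=200)  # nearly collinear with x1
x3 = rng.normal(size=200)                  # independent
X = np.column_stack([x1, x2, x3])
print(vif(X))  # first two VIFs are large, third is near 1
```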
- Akaike Information Criterion
- Bayesian Information Criterion
- Model selection: choose the model with the lowest AIC/BIC score
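The two criteria can be sketched directly from their formulas, AIC = 2k - 2 ln(L) and BIC = k ln(n) - 2 ln(L). The log-likelihoods below are hypothetical numbers, chosen so the extra parameters of model B are not worth their small likelihood gain:

```python
import numpy as np

def aic(log_likelihood, k):
    # AIC = 2k - 2*ln(L); k = number of fitted parameters
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    # BIC = k*ln(n) - 2*ln(L); penalizes parameters harder as n grows
    return k * np.log(n) - 2 * log_likelihood

# Hypothetical fits: model B adds 2 parameters for a tiny likelihood gain
ll_a, k_a = -120.0, 3
ll_b, k_b = -119.5, 5
n = 100
print(aic(ll_a, k_a), aic(ll_b, k_b))        # 246.0 249.0 -> choose A (lower)
print(bic(ll_a, k_a, n), bic(ll_b, k_b, n))  # BIC also prefers A
```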
- Python Code for Logistic Regression
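A minimal end-to-end sketch with scikit-learn on synthetic data (the dataset and split sizes are illustrative assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy binary classification problem (synthetic, for illustration only)
X, y = make_classification(n_samples=500, n_features=5, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = LogisticRegression()  # coefficients fit by maximum likelihood
model.fit(X_train, y_train)

print(model.predict_proba(X_test[:3]))  # class probabilities from the sigmoid
print(model.score(X_test, y_test))      # accuracy on held-out data
```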
- One vs All (OvA) also known as One vs Rest (OvR)
- One vs One (OvO)
- Python Code for Multi Class Classification
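A sketch of both strategies on the 3-class iris dataset (dataset choice is an assumption). OvR trains one binary model per class; OvO trains one per pair of classes, so both happen to train 3 models here (3 classes, 3 pairs):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier, OneVsOneClassifier

X, y = load_iris(return_X_y=True)  # 3 classes

# OvR / OvA: one binary classifier per class
ovr = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, y)
# OvO: one binary classifier per pair of classes -> 3*2/2 = 3 models
ovo = OneVsOneClassifier(LogisticRegression(max_iter=1000)).fit(X, y)

print(len(ovr.estimators_))  # 3
print(len(ovo.estimators_))  # 3 pairs: (0,1), (0,2), (1,2)
print(ovr.score(X, y), ovo.score(X, y))
```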
- L1 Lasso
- SSR + lambda * |slope|
- Useless variables' coefficients can shrink to exactly 0
- L2 Ridge
- SSR + lambda * (slope)^2
- Coefficients shrink toward 0 but never reach exactly 0
- Elastic Net : Combination of L1 & L2
- Python Code of Regularization (L1 Lasso,L2 Ridge & Elastic Net)
- Weight of Evidence : Predictive power of Independent Variables
- Information Value : Technique to select important Variables
- Python Code for WOE and IV
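A small sketch of WOE and IV on a made-up credit-style dataset (the `grade`/`default` columns and the IV > 0.3 "strong predictor" rule of thumb are illustrative assumptions). WOE_i = ln(% of events / % of non-events) per bin; IV sums (difference * WOE) over bins:

```python
import numpy as np
import pandas as pd

def woe_iv(df, feature, target):
    """WOE/IV per category of `feature` against binary `target` (1 = event).
    Assumes every bin contains both events and non-events (no log(0))."""
    grouped = df.groupby(feature)[target].agg(["sum", "count"])
    events = grouped["sum"]
    non_events = grouped["count"] - grouped["sum"]
    pct_events = events / events.sum()
    pct_non_events = non_events / non_events.sum()
    woe = np.log(pct_events / pct_non_events)
    iv = ((pct_events - pct_non_events) * woe).sum()
    return woe, iv

# Tiny hypothetical dataset: loan 'grade' vs default flag
df = pd.DataFrame({
    "grade":   ["A"] * 50 + ["B"] * 50,
    "default": [0] * 45 + [1] * 5 + [0] * 25 + [1] * 25,
})
woe, iv = woe_iv(df, "grade", "default")
print(woe)  # negative for A (fewer defaults), positive for B
print(iv)   # IV > 0.3 -> 'grade' is a strong predictor (rule of thumb)
```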
- Logistic Regression Revision
- Logistic Regression Interview question bank