- Naive Bayes classifier for categorical data from scratch in Python
- Naive Bayes classifier for continuous data from scratch in Python
- Data Visualization: Showing Iris dataset with Blender API
- Norms in vector spaces: A review of norms, with a reminder of p-norms. Finally, we compare some special p-norms.
- Inner products in vector spaces: A reminder of the dot product and the Frobenius inner product, and the canonical norms based on them. Examples using the numpy module are included.
- Gram-Schmidt process: An algorithm to convert a linearly independent set of vectors into an orthogonal set of vectors.
- Boxplot: The elements of a boxplot are reviewed here, including the median, quartiles, fences, and outliers.
- Probability, standard terms: such as sample space, trial, outcome, and event.
- Logistic function: An S-shaped curve that is widely used in machine learning and neural networks.
- Sigmoid functions (curves): Some examples are included. They are widely used in neural networks and deep learning.
- Conditional probability: We review conditional probability and, based on it, derive the multiplication rule.
- Inclusion-exclusion principle: We review this principle both in set theory and in probability. Python code is also provided.
- Probability, independent events: The properties of independent events are reviewed here. The multiplication rule is also included, with some examples.
- Probability, Bayes' rule: Bayes' rule is expressed here, along with the total probability theorem. Bayes' rule is defined in terms of conditional probabilities. Some Python code is included too.
- Linear Regression with Least Squares: When we assume the data points are related through a linear function, we can predict the dependent variable from the independent variable(s). This is linear regression. One way to find the parameters of a linear regression is to use a least-squares estimator. The related Python code clarifies this topic.
- Ridge Regression with Least Squares: Ridge regression is an extension of linear regression in which a penalty term is included in the loss function. This penalty term is called the regularization term. Ridge regression is especially useful when data points are noisy and/or contain outliers. It also shows robustness against overfitting.
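The categorical Naive Bayes topic above can be sketched in a few lines. This is a minimal from-scratch illustration, not the repository's own notebook: the toy weather data, the `predict` helper, and the add-one (Laplace) smoothing are all illustrative assumptions.

```python
from collections import Counter, defaultdict

# Toy training data (made up for illustration): each row is (features, label).
data = [
    (('sunny', 'hot'), 'no'),
    (('sunny', 'mild'), 'no'),
    (('rainy', 'mild'), 'yes'),
    (('rainy', 'cool'), 'yes'),
    (('overcast', 'hot'), 'yes'),
]

class_counts = Counter(label for _, label in data)
# feature_counts[label][feature_index][value] = count
feature_counts = defaultdict(lambda: defaultdict(Counter))
for features, label in data:
    for i, value in enumerate(features):
        feature_counts[label][i][value] += 1

def predict(features):
    # Pick the class maximizing P(class) * prod_i P(feature_i | class),
    # with Laplace (add-one) smoothing to avoid zero probabilities.
    n = len(data)
    best_label, best_score = None, -1.0
    for label, count in class_counts.items():
        score = count / n
        for i, value in enumerate(features):
            num = feature_counts[label][i][value] + 1
            den = count + len({f[i] for f, _ in data})
            score *= num / den
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

With this data, `predict(('rainy', 'mild'))` returns `'yes'`.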
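The comparison of special p-norms mentioned above can be sketched with numpy; the sample vector is an arbitrary choice.

```python
import numpy as np

# Comparing some special p-norms on the same vector.
v = np.array([3.0, -4.0])

norm_1 = np.linalg.norm(v, 1)         # 1-norm (Manhattan): |3| + |-4| = 7
norm_2 = np.linalg.norm(v, 2)         # 2-norm (Euclidean): sqrt(9 + 16) = 5
norm_inf = np.linalg.norm(v, np.inf)  # infinity-norm (max): max(|3|, |-4|) = 4
```

Note that for a fixed vector, the p-norm is non-increasing in p, as the three values above illustrate.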
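The Gram-Schmidt process in the list above can be sketched as a short function. This is classical Gram-Schmidt without re-orthogonalization, written here as an illustration; the function name and example vectors are assumptions.

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a linearly independent list of vectors into an orthogonal list
    (classical Gram-Schmidt, a minimal sketch)."""
    ortho = []
    for v in vectors:
        w = v.astype(float)
        for u in ortho:
            # Subtract the projection of v onto each earlier orthogonal vector.
            w = w - (u @ v) / (u @ u) * u
        ortho.append(w)
    return ortho

u1, u2 = gram_schmidt([np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])])
```

Here `u1 @ u2` is zero up to floating-point error, confirming orthogonality.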
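The boxplot elements listed above (median, quartiles, fences, outliers) can be computed by hand; the sample data here is made up, and the 1.5×IQR fence rule is the usual convention.

```python
import numpy as np

data = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 50])

q1, median, q3 = np.percentile(data, [25, 50, 75])
iqr = q3 - q1  # interquartile range
# Fences sit 1.5 * IQR beyond the quartiles; points outside them are outliers.
lower_fence = q1 - 1.5 * iqr
upper_fence = q3 + 1.5 * iqr
outliers = data[(data < lower_fence) | (data > upper_fence)]
```

For this sample, the single point 50 falls above the upper fence and would be drawn as an outlier.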
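The logistic (S-shaped) curve mentioned above has a simple closed form; the parameter names `L`, `k`, and `x0` below follow the common convention (maximum value, steepness, midpoint) and are an illustrative choice.

```python
import numpy as np

def logistic(x, L=1.0, k=1.0, x0=0.0):
    """General logistic curve: L / (1 + exp(-k * (x - x0)))."""
    return L / (1.0 + np.exp(-k * (x - x0)))
```

With the defaults, `logistic(0.0)` is exactly 0.5 (the midpoint), and the curve approaches 0 and `L` at the two extremes.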
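The inclusion-exclusion principle above can be checked directly on small sets; for two sets it reads |A ∪ B| = |A| + |B| - |A ∩ B| (and the same form holds for probabilities of events). The sets here are arbitrary examples.

```python
A = {1, 2, 3, 4}
B = {3, 4, 5}

# Inclusion-exclusion: |A ∪ B| = |A| + |B| - |A ∩ B|
union_size = len(A) + len(B) - len(A & B)
```

Since the two sets share {3, 4}, the formula gives 4 + 3 - 2 = 5, matching `len(A | B)`.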
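Bayes' rule combined with the total probability theorem, as in the list above, can be sketched with the classic diagnostic-test example; the prevalence, sensitivity, and specificity figures are made up for illustration.

```python
# Hypothetical disease test: 1% prevalence, 95% sensitivity, 90% specificity.
p_disease = 0.01
p_pos_given_disease = 0.95   # sensitivity: P(+ | disease)
p_pos_given_healthy = 0.10   # false positive rate: P(+ | healthy)

# Total probability theorem: P(+) = P(+|D) P(D) + P(+|not D) P(not D)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' rule: P(D | +) = P(+ | D) P(D) / P(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
```

The posterior comes out to roughly 0.088: even after a positive test, the disease remains unlikely because of its low prior probability.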
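The least-squares estimator for linear regression mentioned above can be sketched via the normal equations; the toy data below (an exact line y = 2x + 1) is an assumption for illustration.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])   # exactly y = 2x + 1

# Design matrix with an intercept column of ones.
X = np.column_stack([np.ones_like(x), x])
# Normal equations: solve (X^T X) w = X^T y for w = (intercept, slope).
w = np.linalg.solve(X.T @ X, X.T @ y)
intercept, slope = w
```

Since the data lies exactly on a line, the estimator recovers intercept 1 and slope 2.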
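The ridge regression item above adds an L2 penalty to the least-squares loss, which gives the closed form w = (XᵀX + αI)⁻¹Xᵀy. The noisy data and the penalty weight `alpha = 0.1` below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 2 * x + 1 + rng.normal(0, 0.1, size=x.size)  # noisy y = 2x + 1

X = np.column_stack([np.ones_like(x), x])
alpha = 0.1  # regularization strength (penalty weight)

# Ridge closed form vs. ordinary least squares.
w_ridge = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)
w_ols = np.linalg.solve(X.T @ X, X.T @ y)
```

The penalty shrinks the coefficient vector: ‖w_ridge‖ is never larger than ‖w_ols‖, which is what gives ridge its robustness to noise and overfitting.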
ostad-ai/Machine-Learning
About
This repository contains topics and code related to Machine Learning and Data Science, especially in Python.