This repository contains various implementations and projects related to the Computational Intelligence course at Amirkabir University of Technology.
It focuses on implementing different neural network architectures and machine learning techniques, covering fundamental topics in computational intelligence. The following models and concepts are implemented:
- Single Perceptron
- One-layer Multi Perceptron
- Multi-layer Perceptron
- PyTorch implementations for neural networks
A basic implementation of a single perceptron, demonstrating how the simplest neural network can classify linearly separable data.
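The core of such an implementation is the perceptron learning rule. Below is a minimal NumPy sketch (illustrative only, not the repository's code), assuming labels in {-1, +1}:

```python
import numpy as np

def train_perceptron(X, y, epochs=100, lr=0.1):
    """Perceptron learning rule for labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Update only when the current sample is misclassified
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable example: the AND pattern
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))  # separates the AND pattern: [-1, -1, -1, +1]
```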
An extension of the single-perceptron model to multi-class classification, using multiple perceptrons in one layer.
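One common way to realize this (shown here as an illustrative sketch, not necessarily the repository's approach) keeps one weight vector per class and updates only the true class and the wrongly predicted class:

```python
import numpy as np

def train_one_layer(X, Y, n_classes, epochs=50, lr=0.1):
    """One perceptron per class; the predicted class is the argmax of the scores."""
    W = np.zeros((n_classes, X.shape[1]))   # one weight row per class
    b = np.zeros(n_classes)
    for _ in range(epochs):
        for xi, yi in zip(X, Y):            # Y holds integer class indices
            pred = np.argmax(W @ xi + b)
            if pred != yi:
                W[yi] += lr * xi;   b[yi] += lr      # strengthen the true class
                W[pred] -= lr * xi; b[pred] -= lr    # weaken the wrongly chosen class
    return W, b
```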
A fully connected neural network with multiple hidden layers, capable of solving more complex, non-linear classification problems.
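To illustrate why hidden layers matter, the sketch below (hand-written backpropagation in NumPy, with assumed layer sizes and learning rate, not the repository's code) trains a one-hidden-layer network on XOR, a problem no single-layer perceptron can solve:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR is not linearly separable, so at least one hidden layer is required
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer with 8 units
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10_000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass (mean squared error), then a gradient-descent step
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(out.round().ravel())  # typically converges to [0, 1, 1, 0]
```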
PyTorch versions of the above models are also provided; the library's automatic differentiation, built-in layers, and optimizers allow for efficient model training and scaling.
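As a rough illustration (again a sketch with assumed hyperparameters, not the repository's code), the same XOR network can be expressed in a few lines of PyTorch, with autograd replacing the hand-written backward pass:

```python
import torch
from torch import nn

model = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 1))
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

for _ in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()     # autograd computes all gradients
    optimizer.step()    # the optimizer updates the parameters

print((torch.sigmoid(model(X)) > 0.5).int().ravel())  # usually [0, 1, 1, 0]
```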
Familiarity with the following advanced neural network techniques is demonstrated (a combined PyTorch sketch follows the list):
- Data Augmentation: Techniques to increase the diversity of training data to improve model generalization.
- Batch Normalization: A method to normalize the inputs of each layer in order to speed up training and make the network more stable.
- Dropout: A regularization technique to prevent overfitting by randomly dropping units during training.
- Activation Functions: Implementation of various activation functions such as ReLU, Sigmoid, and Tanh to introduce non-linearity into the models.
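The sketch below combines these techniques, assuming PyTorch and torchvision are available (layer sizes, dropout probability, and transforms are illustrative choices, not the repository's):

```python
import torch
from torch import nn
from torchvision import transforms

# Data augmentation: random flips and shifted crops increase training-data diversity.
# This pipeline would be passed as the `transform` argument of a torchvision dataset.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomCrop(28, padding=4),
    transforms.ToTensor(),
])

# A small network combining batch normalization, dropout, and different activations
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 256),
    nn.BatchNorm1d(256),   # normalize layer inputs for faster, more stable training
    nn.ReLU(),             # non-linearity; Sigmoid or Tanh could be used instead
    nn.Dropout(p=0.5),     # randomly drop units during training to reduce overfitting
    nn.Linear(256, 64),
    nn.Tanh(),
    nn.Linear(64, 10),
)

model.train()  # dropout active, batch norm uses batch statistics
logits = model(torch.randn(32, 1, 28, 28))
model.eval()   # both layers switch to their inference behavior
```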
An implementation of the K-Means clustering algorithm, which groups data points into clusters based on feature similarity.
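The algorithm alternates between assigning points to the nearest centroid and recomputing each centroid as the mean of its assigned points. A minimal NumPy sketch (illustrative, not the repository's code):

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Basic K-Means: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assign every point to its nearest centroid (Euclidean distance)
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of the points assigned to it
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break  # assignments stopped changing
        centroids = new_centroids
    return centroids, labels

# Example: two well-separated blobs
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
centroids, labels = kmeans(X, k=2)
```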