All the code files related to the deep learning course from PadhAI
AutoInit: Analytic Signal-Preserving Weight Initialization for Neural Networks
A module for making weight initialization easier in PyTorch.
A curated list of awesome deep learning techniques for deep neural network training, testing, optimization, regularization, etc.
Predict the burned area of forest fires with neural networks.
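As an illustration of the task such a module simplifies (this sketch uses only the standard torch.nn.init API, not the module's own interface, which is not shown here), a custom initialization in plain PyTorch is usually applied via Module.apply:

```python
# Illustrative only: standard torch.nn.init usage, not the module described above.
import torch.nn as nn

def init_weights(module):
    # Give every Linear layer Xavier-uniform weights and zero biases.
    if isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)
        nn.init.zeros_(module.bias)

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
model.apply(init_weights)  # apply() visits every submodule recursively
```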
Neural_Networks_From_Scratch
How weight initialization affects forward and backward passes of a deep neural network
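A minimal sketch of the forward-pass side of this effect (my own illustration, not the repository's notebook): with a too-small initialization scale the activations collapse toward zero through a deep tanh stack, with a too-large scale they saturate, while a roughly 1/sqrt(fan_in) scale keeps the activation variance stable.

```python
# Illustrative sketch: measure how the activation scale drifts through a deep
# stack of tanh layers for three initialization scales (too small, ~Xavier, too large).
import torch

def activation_std_after(depth, width, init_std):
    x = torch.randn(1024, width)
    for _ in range(depth):
        w = torch.randn(width, width) * init_std
        x = torch.tanh(x @ w)
    return x.std().item()

width = 512
for init_std in (0.01, (1.0 / width) ** 0.5, 0.5):
    print(f"std={init_std:.4f} -> activation std after 20 layers: "
          f"{activation_std_after(20, width, init_std):.4f}")
```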
FloydHub porting of deeplearning.ai course assignments
Neural Networks: Zero to Hero. I completed the tutorial series by Andrej Karpathy.
Excel file and Python code used in the published SLR paper: RNN-LSTM: From Applications to Modeling Techniques and Beyond - Systematic Review
Making a Deep Learning Framework with C++
Use MLflow and TensorFlow 2.0 (Keras) to record all the experiments on the Fashion MNIST dataset.
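A sketch of that workflow using only the public MLflow and tf.keras APIs (not the repository's actual scripts): autologging captures the hyperparameters, per-epoch metrics, and the trained model for each run.

```python
# Illustrative sketch: MLflow autologging around a small Keras model on Fashion MNIST.
import mlflow
import tensorflow as tf

mlflow.tensorflow.autolog()  # logs params, per-epoch metrics, and the trained model

(x_train, y_train), _ = tf.keras.datasets.fashion_mnist.load_data()
x_train = x_train / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

with mlflow.start_run():
    model.fit(x_train, y_train, epochs=3, validation_split=0.1)
```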
Neural Network
MachineLearningCurves is a collection of abstract papers, insights, and research notes focusing on various topics in machine learning.
Playground for trials, attempts and small projects.
This code implements a neural network from scratch without using any library.
Deep Learning with TensorFlow Keras and PyTorch
Data driven initialization for neural network models
Variance normalising pre-training of neural networks.
Comparing different methods of weight initialization and optimizers using PyTorch.
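A rough sketch of such a comparison (my own illustration using standard PyTorch APIs; the repository's helpers and training loop are not shown): build the same model under each initializer, attach a different optimizer, and run an identical training loop for each configuration.

```python
# Illustrative sketch: pair different initializers with different optimizers.
import torch
import torch.nn as nn

def build_model(init_fn):
    model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
    for layer in model:
        if isinstance(layer, nn.Linear):
            init_fn(layer.weight)
            nn.init.zeros_(layer.bias)
    return model

experiments = {
    "xavier + SGD":   (nn.init.xavier_uniform_, lambda p: torch.optim.SGD(p, lr=0.1)),
    "kaiming + Adam": (nn.init.kaiming_normal_, lambda p: torch.optim.Adam(p, lr=1e-3)),
}

for name, (init_fn, make_optimizer) in experiments.items():
    model = build_model(init_fn)
    optimizer = make_optimizer(model.parameters())
    # ... run the same training loop for each configuration and compare losses ...
    print(name, "configured")
```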