This Specialization is a foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. In this Specialization, you will build and train neural network architectures such as Convolutional Neural Networks, Recurrent Neural Networks, LSTMs, and Transformers, and learn how to make them better with strategies such as Dropout, BatchNorm, Xavier/He initialization, and more. Get ready to master theoretical concepts and their industry applications using Python and TensorFlow, and tackle real-world cases such as speech recognition, music synthesis, chatbots, machine translation, natural language processing, and more.
Topics Covered:
- Build and train deep neural networks, implement vectorized neural networks, identify architecture parameters, and apply DL to your applications.
- Apply best practices for setting up train/dev/test sets and analyzing bias/variance in DL applications, use standard NN techniques, apply optimization algorithms, and implement a neural network in TensorFlow (a minimal sketch follows this list).
- Use strategies for reducing errors in ML systems, understand complex ML settings, and apply end-to-end, transfer, and multi-task learning.
- Build a Convolutional Neural Network, apply it to visual detection and recognition tasks, use neural style transfer to generate art, and apply these algorithms to image, video, and other 2D/3D data (see the CNN sketch after this list).
- Build and train Recurrent Neural Networks and their variants (GRUs, LSTMs), apply RNNs to character-level language modeling, work with NLP and Word Embeddings, and use HuggingFace tokenizers and transformers to perform Named Entity Recognition and question answering (see the sequence-model sketch after this list).
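
To make the first two bullets concrete, here is a minimal sketch, not taken from the course materials, of a small fully connected network in TensorFlow/Keras that combines He initialization, BatchNorm, and Dropout. The input shape, layer sizes, class count, and hyperparameters are illustrative assumptions.

```python
import tensorflow as tf

# Small fully connected classifier with He initialization, BatchNorm, and Dropout.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),                              # assumed: flattened 28x28 inputs
    tf.keras.layers.Dense(128, kernel_initializer="he_normal"),
    tf.keras.layers.BatchNormalization(),                      # normalize pre-activations
    tf.keras.layers.Activation("relu"),
    tf.keras.layers.Dropout(0.3),                              # drop 30% of units during training
    tf.keras.layers.Dense(64, kernel_initializer="he_normal", activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),           # assumed: 10 output classes
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(x_train, y_train, validation_split=0.1, epochs=10)  # with your own data
```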
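
For the Convolutional Neural Network bullet, a similarly hedged sketch of a small image classifier; the 64x64 RGB input shape and five output classes are assumptions, not values taken from the course.

```python
import tensorflow as tf

# Two conv/pool stages followed by a dense classification head.
cnn = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),                                  # assumed: 64x64 RGB images
    tf.keras.layers.Conv2D(32, (3, 3), padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),                     # assumed: 5 classes
])

cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```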
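
And for the sequence-model bullet, a sketch of a character-level language model built from an Embedding layer and an LSTM; the vocabulary size and sequence length are hypothetical placeholders.

```python
import tensorflow as tf

VOCAB_SIZE = 65   # hypothetical character-vocabulary size
SEQ_LEN = 100     # hypothetical input sequence length

# Predict a distribution over the next character at every position in the sequence.
rnn = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN,), dtype="int32"),           # sequences of character indices
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),                 # 64-d embedding per character
    tf.keras.layers.LSTM(128, return_sequences=True),          # swap in GRU(128, ...) to compare variants
    tf.keras.layers.Dense(VOCAB_SIZE, activation="softmax"),   # next-character distribution per step
])

rnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

For the Named Entity Recognition and question-answering tasks named above, the HuggingFace `transformers` library offers high-level entry points such as `pipeline("ner")` and `pipeline("question-answering")`, which wrap pretrained tokenizers and transformer models.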
Course 1: Neural Networks and Deep Learning
Artificial Neural Networks; Deep Learning; Backpropagation; Python programming
Course 2: Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
Mathematical optimization; Hyperparameter tuning
Course 3: Structuring Machine Learning Projects
Inductive Transfer; Machine Learning; Multi-task learning; Decision-making
Course 4: Convolutional Neural Networks
Facial Recognition system; Convolutional Neural Network architecture; Object Detection and Segmentation
Course 5: Sequence Models
Long Short-Term Memory (LSTM); Gated Recurrent Unit (GRU); Recurrent Neural Networks (RNN); Attention Models