
VisualTales-Image-Caption-Generator

Week 1

Week 2

Resources:

These are my course slides; they are a little more maths-centric, so a single read-through is enough if you prefer.

You can also check this to get a high-level understanding of both: link

Video Lectures:

You can check out these videos; they will give you some good visualization and intuition.

Week 3

The main objective of this week is to learn about basic Artificial Neural Networks and the fundamentals of optimizers. We'll be using either PyTorch, Keras, or TensorFlow for the implementation of neural nets. Don't spend too much time on PyTorch syntax.

Resources:

In case you prefer Keras or TensorFlow: you don't have to learn all three frameworks, just pick any one of them.
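
To see how these pieces fit together, here is a minimal PyTorch sketch (illustrative only, not this week's assignment): a tiny feed-forward network trained on random data with the Adam optimizer. The layer sizes, data, and hyperparameters are arbitrary.

```python
import torch
import torch.nn as nn

# Tiny feed-forward network; sizes are arbitrary and for illustration only.
model = nn.Sequential(
    nn.Linear(10, 32),   # 10 input features -> 32 hidden units
    nn.ReLU(),
    nn.Linear(32, 2),    # 2 output classes
)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(64, 10)           # dummy batch of 64 samples
y = torch.randint(0, 2, (64,))    # dummy class labels

for epoch in range(5):
    optimizer.zero_grad()           # clear gradients from the previous step
    loss = criterion(model(x), y)   # forward pass + loss
    loss.backward()                 # backpropagation
    optimizer.step()                # optimizer updates the weights
    print(epoch, loss.item())
```

Swapping `torch.optim.Adam` for `torch.optim.SGD` is a quick way to see how the choice of optimizer changes training.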

Week 4

The main objective of this week is to learn about Convolutional Neural Networks (CNNs), explore their architectures, and understand different CNN models. We'll be using libraries such as NumPy, PyTorch, TensorFlow, and Keras.

Recap on Neural Networks

Refer to the links below for a quick recap:

For more in-depth knowledge, you can look for similar videos on YouTube. Here is a playlist on Neural Networks by 3Blue1Brown.

Basics of CNNs

  1. Introduction to CNNs:

CNN Architectures

  1. Explore these Popular CNN Architectures:
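
If you are using PyTorch, a convenient way to poke at these architectures is `torchvision.models`. This is just a sketch and assumes torchvision is installed; newer versions accept `weights=None`, while older ones use `pretrained=False`.

```python
import torch
from torchvision import models

# Build the architectures without downloading pretrained weights.
resnet = models.resnet18(weights=None)   # older torchvision: pretrained=False
vgg = models.vgg16(weights=None)

x = torch.randn(1, 3, 224, 224)          # one fake 224x224 RGB image
print(resnet(x).shape)                   # torch.Size([1, 1000]) - ImageNet classes
print(vgg(x).shape)                      # torch.Size([1, 1000])
```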

Implementation

For this week's assignment, we will be using NumPy, PyTorch, and TensorFlow. It is advisable to get comfortable with these libraries beforehand.
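
As a warm-up before the assignment, here is a minimal, self-contained CNN in PyTorch (a sketch with arbitrary layer sizes, assuming 28x28 grayscale inputs such as MNIST):

```python
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    """A tiny CNN sketch: conv -> pool -> conv -> pool -> fully connected."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),    # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 28x28 -> 14x14
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(16 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyCNN()
print(model(torch.randn(4, 1, 28, 28)).shape)  # torch.Size([4, 10])
```

The same conv -> pool -> fully-connected pattern can be reproduced by hand in NumPy or in TensorFlow/Keras with `Conv2D` and `MaxPooling2D` layers.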

Video Lectures

  1. Video Tutorials:

Optional Deep Dive

  1. Advanced Topics (Optional):

Week 5

This week, we'll delve into the fascinating world of Recurrent Neural Networks (RNNs) and explore how they can be used to process sequences of data, such as sentences and text.

Coursera Course

Strongly Recommended

Complete the first two weeks of the course "Sequence Models" by Andrew Ng on Coursera.

This course covers essential concepts related to sequential neural networks and word embeddings. You can audit the content for free, and it provides a solid theoretical foundation for the topics we'll cover this week.

Understanding RNNs

Recurrent Neural Networks (RNNs) are designed to process sequences of data. Learn the basics of RNNs and how they maintain a memory of past inputs:
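
Here is a minimal PyTorch sketch of the idea (illustrative shapes only): an `nn.RNN` reads a batch of sequences one time step at a time, and the hidden state it returns is the "memory" carried forward from earlier steps.

```python
import torch
import torch.nn as nn

# Toy sketch: an RNN reads a sequence step by step and keeps a hidden state.
rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)

x = torch.randn(2, 5, 8)   # batch of 2 sequences, 5 time steps, 8 features each
output, h_n = rnn(x)       # output: hidden state at every step, h_n: final state

print(output.shape)        # torch.Size([2, 5, 16])
print(h_n.shape)           # torch.Size([1, 2, 16])
```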

We will also explore Natural Language Processing (NLP), a field that focuses on the interaction between computers and humans using natural language. Understanding the fundamentals of text processing, language modeling, and sentiment analysis will be crucial for tasks such as chatbots, language translation, and text summarization.

NLP (Natural Language Processing)

CONTENTS

  • Word Embeddings
  • Text Preprocessing
  • Sentiment Analysis

Word Embeddings

Neural networks cannot process words directly; they deal only with numerical vectors and their computations. To feed text as input to a neural network, we need to convert it into vector form using word embeddings. Various techniques (TF-IDF, Skip-gram, CBOW) and implementations (GloVe, FastText, etc.) exist for this purpose.

Read these articles:
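
To make the idea concrete, here is a small PyTorch sketch (illustrative only) in which an `nn.Embedding` layer acts as a learnable lookup table from integer word ids to dense vectors; pretrained vectors such as GloVe or FastText can be loaded into this kind of layer instead of learning the vectors from scratch.

```python
import torch
import torch.nn as nn

# Sketch: words are mapped to integer ids, then to dense vectors by a lookup table.
vocab = {"<pad>": 0, "the": 1, "cat": 2, "sat": 3}
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=5)

ids = torch.tensor([[vocab["the"], vocab["cat"], vocab["sat"]]])
vectors = embedding(ids)   # each id becomes a 5-dimensional vector
print(vectors.shape)       # torch.Size([1, 3, 5])
```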

Text Preprocessing

Text preprocessing in NLP is essential to clean and transform raw text data, addressing issues like irrelevant characters, formatting, and inconsistencies to ensure its suitability for analysis by machine learning models.

Main Topics to Keep in Mind (a short sketch follows this list):

  • Tokenization
  • Lowercase conversion
  • Stopwords removal
  • Stemming
  • Lemmatization
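
A short NLTK sketch of these steps (assuming the `punkt`, `stopwords`, and `wordnet` data packages have been downloaded):

```python
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer, WordNetLemmatizer
from nltk.tokenize import word_tokenize

# Assumes these NLTK data packages are downloaded:
# nltk.download("punkt"); nltk.download("stopwords"); nltk.download("wordnet")

text = "The striped bats are hanging on their feet for best"

tokens = word_tokenize(text.lower())              # tokenization + lowercasing
stops = set(stopwords.words("english"))
tokens = [t for t in tokens if t not in stops]    # stopword removal

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()
print([stemmer.stem(t) for t in tokens])          # crude rule-based stemming
print([lemmatizer.lemmatize(t) for t in tokens])  # dictionary-based lemmatization
```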

RESOURCES

Sentiment Analysis

Sentiment analysis in NLP involves determining the emotional tone or subjective information expressed in a piece of text, helping identify and quantify sentiments such as positive, negative, or neutral attitudes.
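
A quick sketch with NLTK's rule-based VADER analyzer (assuming the `vader_lexicon` data package has been downloaded); the `compound` score summarizes the overall polarity:

```python
from nltk.sentiment import SentimentIntensityAnalyzer
# Assumes: nltk.download("vader_lexicon") has been run.

sia = SentimentIntensityAnalyzer()
print(sia.polarity_scores("I absolutely loved this movie!"))
# positive 'compound' score -> positive sentiment
print(sia.polarity_scores("This was a terrible waste of time."))
# negative 'compound' score -> negative sentiment
```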

NLTK vs spaCy

NLTK is versatile and suitable for learning and experimenting with various NLP concepts, making it popular in academic and research settings. spaCy, with its emphasis on speed and ease of use, is favored in the industry for developing efficient and scalable NLP applications. The choice between NLTK and spaCy depends on the specific needs of a project and the user's goals, whether it be educational exploration or real-world application development.

TEXT PROCESSING WITH SPACY

RESOURCES

Main Topics to Keep in Mind (see the sketch after this list):

  • Tokenization
  • Parts of Speech Tagging
  • Named Entity Recognition
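
A spaCy sketch covering the three topics above (assuming the small English model has been installed with `python -m spacy download en_core_web_sm`):

```python
import spacy

# Assumes the small English model is installed:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")

for token in doc:               # tokenization + part-of-speech tagging
    print(token.text, token.pos_)

for ent in doc.ents:            # named entity recognition
    print(ent.text, ent.label_) # e.g. Apple ORG, U.K. GPE, $1 billion MONEY
```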
