
Word Embedding Approaches #22

Open
imsanjoykb opened this issue Nov 7, 2020 · 0 comments
Labels
documentation Improvements or additions to documentation

One of the reasons Natural Language Processing is a difficult problem is that, unlike human beings, computers can only work with numbers. We therefore have to represent words in a numeric format that computers can process. Word embedding refers to these numeric representations of words.

Several word embedding approaches exist, each with its own pros and cons. We will discuss three of them here:

Bag of Words
TF-IDF Scheme
Word2Vec
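The first two approaches can be illustrated with a minimal pure-Python sketch. The corpus below and the exact weighting choices (raw counts for Bag of Words, unsmoothed `tf * log(N/df)` for TF-IDF) are illustrative assumptions, not the only formulation; in practice one would typically reach for scikit-learn's `CountVectorizer` and `TfidfVectorizer`. Word2Vec is different in kind: it learns dense vectors from word context rather than computing them from counts, and is usually trained with a library such as gensim.

```python
from collections import Counter
import math

# Toy corpus (illustrative assumption, not from the issue).
corpus = [
    "the cat sat on the mat",
    "the dog ate my homework",
    "the cat ate the fish",
]

# Vocabulary: one vector dimension per unique word in the corpus.
vocab = sorted({word for doc in corpus for word in doc.split()})

def bag_of_words(doc):
    """Bag of Words: a document becomes a vector of raw word counts."""
    counts = Counter(doc.split())
    return [counts[word] for word in vocab]

def tf_idf(doc):
    """TF-IDF: term frequency weighted by inverse document frequency,
    so a word that appears in every document (like 'the') scores zero."""
    tokens = doc.split()
    counts = Counter(tokens)
    n_docs = len(corpus)
    vec = []
    for word in vocab:
        tf = counts[word] / len(tokens)
        df = sum(1 for d in corpus if word in d.split())
        idf = math.log(n_docs / df) if df else 0.0
        vec.append(tf * idf)
    return vec

bow = bag_of_words(corpus[0])
tfidf = tf_idf(corpus[0])
```

Note how TF-IDF fixes a weakness of Bag of Words: in the count vector for the first document, "the" has the largest entry simply because it is frequent, while its TF-IDF weight is zero because it occurs in every document and so carries no discriminative information.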

@imsanjoykb imsanjoykb added the documentation Improvements or additions to documentation label Nov 7, 2020