A placeholder for my work on word embeddings and knowledge graph representations
Word embeddings are vector representations of words or text, built so that similar words, or words used in the same context, lie closer together in a high-dimensional space than words with different meanings or words used out of context.
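A toy illustration of the "closer in space" idea, using cosine similarity between hand-made vectors (the vectors and words below are made up purely for illustration):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 3-d embeddings; real embeddings have hundreds of dimensions.
cat = np.array([0.9, 0.8, 0.1])
dog = np.array([0.8, 0.9, 0.2])   # semantically close to "cat"
car = np.array([0.1, 0.2, 0.9])   # semantically distant

print(cosine_similarity(cat, dog))  # high: similar meaning
print(cosine_similarity(cat, car))  # low: different meaning
```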
Here I use an auxiliary graph to enhance word embedding training (word2vec), using the graph's connections as additional context during the training process.
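A minimal sketch of one way this could work, assuming the graph's edges are turned into short pseudo-sentences that are appended to the text corpus before training; the corpus, graph edges, and augmentation strategy below are illustrative assumptions, not the project's finalized method:

```python
import networkx as nx
from gensim.models import Word2Vec

# Plain text corpus: each document is a tokenized sentence (toy data).
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["dogs", "and", "cats", "are", "pets"],
]

# Auxiliary knowledge graph: nodes are vocabulary terms, edges are relations.
graph = nx.Graph()
graph.add_edges_from([("cat", "pet"), ("dog", "pet"), ("cat", "mammal")])

# Turn each edge into a two-token pseudo-sentence so that connected terms
# co-occur inside the word2vec context window during training.
graph_context = [[u, v] for u, v in graph.edges()]

# Train skip-gram word2vec on the combined text corpus + graph context.
model = Word2Vec(
    sentences=corpus + graph_context,
    vector_size=100,  # embedding dimensionality
    window=5,         # context window size
    min_count=1,      # keep every token in this tiny example
    sg=1,             # use skip-gram
)

# Terms connected in the graph should now end up closer in embedding space.
print(model.wv.similarity("cat", "dog"))
```

Treating each edge as a tiny sentence is the simplest form of graph context; random walks over the graph (as in DeepWalk/node2vec-style approaches) would produce longer pseudo-sentences and richer co-occurrence statistics.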