
# CBOW

CBOW, or Continuous Bag of Words, uses word embeddings to train a neural network in which the context is represented by multiple words for a given target word.

For example, we could use “cat” and “tree” as context words for “climbed” as the target word. This calls for a modification to the neural network architecture: the input-to-hidden-layer connections are replicated C times, where C is the number of context words, and a divide-by-C operation is added in the hidden-layer neurons.

The CBOW architecture is fairly simple (see the sketch after this list) and contains:

  • the word embeddings as inputs (word indices `idx`)
  • a linear layer as the hidden layer
  • `log_softmax` as the output
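
A minimal sketch of such a model in PyTorch; the class and parameter names here are illustrative assumptions, not taken from this repository:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CBOW(nn.Module):
    def __init__(self, vocab_size, embedding_dim=100):
        super().__init__()
        # Word indices -> dense embedding vectors.
        self.embeddings = nn.Embedding(vocab_size, embedding_dim)
        # Hidden linear layer projecting back to vocabulary scores.
        self.linear = nn.Linear(embedding_dim, vocab_size)

    def forward(self, context_idxs):
        # Average the C context embeddings (the divide-by-C step).
        hidden = self.embeddings(context_idxs).mean(dim=0)
        # Log-probabilities over the vocabulary.
        return F.log_softmax(self.linear(hidden), dim=-1)
```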

Input:

```python
sentence = "we are about to study the idea of computational"
```

The input is the context words around a centered target word, i.e. [context <--, context <-, target, context ->, context -->], in the form ([context], target):

```python
[(['we', 'are', 'to', 'study'], 'about'),
 (['are', 'about', 'study', 'the'], 'to'),
 (['about', 'to', 'the', 'idea'], 'study'),
 (['to', 'study', 'idea', 'of'], 'the'),
 (['study', 'the', 'of', 'computational'], 'idea')]
```
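
These pairs can be built by sliding a window of two context words on each side over the sentence; a minimal sketch (variable names are illustrative):

```python
raw_text = "we are about to study the idea of computational".split()
window = 2  # two context words on each side of the target

data = []
for i in range(window, len(raw_text) - window):
    # Words before and after position i form the context; raw_text[i] is the target.
    context = raw_text[i - window:i] + raw_text[i + 1:i + window + 1]
    data.append((context, raw_text[i]))

print(data)  # matches the ([context], target) pairs above
```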

Verification input for the trained model:

```python
# (['we', 'are', 'to', 'study'], 'about')
word = predict(['we', 'are', 'to', 'study'])
```

Output:

```python
word = 'about'
```
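
The repository's `predict` is not reproduced here; a plausible minimal version, assuming `word_to_ix`/`ix_to_word` vocabulary maps and the trained `model` sketched above, might look like:

```python
def predict(context_words):
    # Look up indices for the context words (assumes a word_to_ix mapping).
    idxs = torch.tensor([word_to_ix[w] for w in context_words])
    # Return the vocabulary entry with the highest log-probability.
    log_probs = model(idxs)
    return ix_to_word[int(log_probs.argmax())]
```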

Dataset:

Training on `train-nn.txt` with `embedded_size=100`, `windowed_sz=4`, and input `data.size=3298`:

```
<< loss : 98.180715
<< sucess: 95.451788
```
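
For completeness, a hedged sketch of a training loop that could be used here, pairing `NLLLoss` with the model's `log_softmax` output; the hyperparameters and names are assumptions, not the repository's actual values:

```python
import torch.optim as optim

model = CBOW(vocab_size, embedding_dim=100)
loss_fn = nn.NLLLoss()  # expects log-probabilities, matching log_softmax
optimizer = optim.SGD(model.parameters(), lr=0.01)

for epoch in range(50):
    total_loss = 0.0
    for context, target in data:
        idxs = torch.tensor([word_to_ix[w] for w in context])
        log_probs = model(idxs).unsqueeze(0)        # shape (1, vocab_size)
        target_idx = torch.tensor([word_to_ix[target]])
        loss = loss_fn(log_probs, target_idx)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        total_loss += loss.item()
```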

Reference