Fully batched seq2seq example based on practical-pytorch, with extra features.
Updated Mar 11, 2018 · Jupyter Notebook
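As a rough illustration of what "fully batched" means in a seq2seq pipeline, the sketch below encodes a padded batch with PyTorch's packed-sequence utilities so the RNN skips pad positions. All sizes, variable names, and the `PAD_IDX` constant are illustrative assumptions, not taken from the repository above.

```python
# Minimal sketch: encoding a whole padded batch at once with
# pack_padded_sequence, instead of looping over sequences one at a time.
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

PAD_IDX = 0                      # assumed pad token id
vocab_size, hidden_size = 1000, 256  # illustrative sizes

embedding = nn.Embedding(vocab_size, hidden_size, padding_idx=PAD_IDX)
encoder = nn.GRU(hidden_size, hidden_size, batch_first=True)

# A padded batch of 3 sequences with true lengths 4, 3, 2 (sorted descending).
batch = torch.tensor([[5, 8, 2, 9],
                      [4, 7, 3, PAD_IDX],
                      [6, 1, PAD_IDX, PAD_IDX]])
lengths = torch.tensor([4, 3, 2])

# Pack so the GRU only processes real (non-pad) timesteps.
packed = pack_padded_sequence(embedding(batch), lengths,
                              batch_first=True, enforce_sorted=True)
packed_out, hidden = encoder(packed)

# Unpack back to a padded tensor of shape (batch, max_len, hidden).
outputs, _ = pad_packed_sequence(packed_out, batch_first=True)
print(outputs.shape)  # torch.Size([3, 4, 256])
```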
A library for rapidly sharing and retrieving word embeddings over the internet (EMNLP 2017).
The Transformer model implemented from scratch in PyTorch, with weight sharing between the embedding layers and the pre-softmax linear layer. Training is demonstrated on the Multi30k machine translation task.
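The weight sharing mentioned above can be shown in a short PyTorch sketch that ties the target embedding matrix to the pre-softmax linear layer, as in "Attention Is All You Need". The sizes and variable names here are illustrative assumptions, not the repository's actual code.

```python
# Minimal sketch of embedding/output-projection weight tying.
import torch
import torch.nn as nn

vocab_size, d_model = 10000, 512  # illustrative sizes

embedding = nn.Embedding(vocab_size, d_model)
generator = nn.Linear(d_model, vocab_size, bias=False)

# Share one parameter tensor: the output projection reuses the embedding
# weights, so gradients from both paths update the same matrix.
generator.weight = embedding.weight

x = torch.randint(0, vocab_size, (2, 7))  # (batch, seq_len) of token ids
hidden = embedding(x)                     # stand-in for decoder output states
logits = generator(hidden)                # (2, 7, vocab_size)
assert generator.weight.data_ptr() == embedding.weight.data_ptr()
```

The tying works because `nn.Linear(d_model, vocab_size)` stores its weight with shape `(vocab_size, d_model)`, exactly matching the embedding matrix; sharing it roughly halves the parameter count of the output layer.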
PyTorch implementation of "Multi-domain translation between single-cell imaging and sequencing data using autoencoders" (https://www.nature.com/articles/s41467-020-20249-2) with custom models.