PyTorch implementations of several attention mechanisms for deep learning researchers.
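Since the topic here is additive attention, a minimal sketch of a Bahdanau-style additive attention layer in PyTorch may help orient readers. The class name, layer shapes, and dimensions below are illustrative assumptions, not code from any repository listed on this page.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention: score(q, k) = v^T tanh(W_q q + W_k k)."""
    def __init__(self, query_dim: int, key_dim: int, hidden_dim: int):
        super().__init__()
        self.W_q = nn.Linear(query_dim, hidden_dim, bias=False)  # projects the query
        self.W_k = nn.Linear(key_dim, hidden_dim, bias=False)    # projects each key
        self.v = nn.Linear(hidden_dim, 1, bias=False)            # scoring vector

    def forward(self, query, keys, values):
        # query: (batch, query_dim); keys, values: (batch, seq_len, key_dim)
        scores = self.v(torch.tanh(self.W_q(query).unsqueeze(1) + self.W_k(keys)))
        weights = F.softmax(scores.squeeze(-1), dim=-1)          # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), values).squeeze(1)
        return context, weights

# Example: one decoder query attending over a 10-step encoder sequence.
attn = AdditiveAttention(query_dim=256, key_dim=256, hidden_dim=128)
q = torch.randn(4, 256)       # one query per batch element
kv = torch.randn(4, 10, 256)  # encoder states serve as both keys and values
context, weights = attn(q, kv, kv)  # weights sum to 1 over the 10 positions
```

Unlike dot-product attention, the score here comes from a small feed-forward network over the summed projections, which is why this family is called "additive."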
A set of notebooks exploring Recurrent Neural Networks (RNNs), with a focus on LSTM, BiLSTM, seq2seq, and attention.
LEAP: Linear Explainable Attention in Parallel for causal language modeling, with O(1) path length and O(1) inference.
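LEAP's actual algorithm lives in its repository; as rough intuition for how O(1)-per-token inference can arise, the sketch below shows a generic kernelized causal linear-attention recurrence (in the style of Katharopoulos et al., 2020) whose state size is constant in sequence length. The function name and the ReLU feature map are assumptions for illustration, not LEAP's method.

```python
import torch

def linear_attention_step(state, norm, q, k, v, eps=1e-6):
    """One causal step: a constant-size state replaces attention over the prefix.

    state: (d_k, d_v) running sum of phi(k_t) v_t^T
    norm:  (d_k,)     running sum of phi(k_t)
    q, k:  (d_k,)     this token's query/key; v: (d_v,) this token's value
    """
    phi_q = torch.relu(q) + eps             # simple positive feature map (an assumption)
    phi_k = torch.relu(k) + eps
    state = state + torch.outer(phi_k, v)   # O(d_k * d_v) update, independent of t
    norm = norm + phi_k
    out = (phi_q @ state) / (phi_q @ norm)  # normalized attention output for this token
    return state, norm, out

# Decode token by token: per-step cost and memory stay constant.
d_k, d_v = 64, 64
state, norm = torch.zeros(d_k, d_v), torch.zeros(d_k)
for q, k, v in zip(torch.randn(5, d_k), torch.randn(5, d_k), torch.randn(5, d_v)):
    state, norm, out = linear_attention_step(state, norm, q, k, v)
```

Because each token's output depends only on the constant-size (state, norm) pair rather than the full prefix, per-token inference cost is bounded, which is the flavor of guarantee LEAP advertises.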