This repository contains various types of attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
Configurable Encoder-Decoder Sequence-to-Sequence model. Built with TensorFlow.
Chapter 9: Attention and Memory Augmented Networks
TensorFlow 2.0 tutorials for RNN-based architectures for textual problems.
A multi-layer bidirectional seq2seq chatbot with Bahdanau attention.
Generate captions from images
Image captioning is the process of generating a textual description of an image, combining natural language processing and computer vision.
Master Project on Image Captioning using Supervised Deep Learning Methods
A simple, easy-to-understand NLP tutorial.
Seq2Seq model implemented in PyTorch, using Bahdanau attention and Luong attention.
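Luong-style (multiplicative) attention, mentioned here alongside Bahdanau's, scores each encoder state against the decoder state with a dot product rather than a learned feed-forward network. A minimal NumPy sketch, with illustrative dimensions chosen for the example:

```python
import numpy as np

def luong_dot_attention(query, keys):
    """Luong-style dot-product attention.
    query: (d,) decoder hidden state; keys: (T, d) encoder hidden states.
    Returns the context vector (d,) and attention weights (T,).
    """
    scores = keys @ query                  # (T,) dot-product alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()               # softmax over encoder time steps
    context = weights @ keys               # weighted sum of encoder states
    return context, weights

# toy example with made-up dimensions
rng = np.random.default_rng(1)
context, weights = luong_dot_attention(rng.normal(size=6),
                                       rng.normal(size=(7, 6)))
```

The weights form a probability distribution over the 7 encoder time steps; the context vector is their weighted average.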
Implementation of the paper "Neural Machine Translation by Jointly Learning to Align and Translate".
Implementation of GRU-based Encoder-Decoder Architecture with Bahdanau Attention Mechanism for Machine Translation from German to English.
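Bahdanau (additive) attention, used by several of the repositories above, scores the decoder state against each encoder state through a small feed-forward network: score(s, h) = vᵀ tanh(W_q s + W_k h). A minimal NumPy sketch, with hypothetical weight matrices and dimensions chosen only for illustration:

```python
import numpy as np

def bahdanau_attention(query, keys, W_q, W_k, v):
    """Additive (Bahdanau) attention.
    query: (d_q,) decoder hidden state
    keys:  (T, d_k) encoder hidden states
    W_q: (d_q, d_a), W_k: (d_k, d_a), v: (d_a,) learned parameters
    Returns the context vector (d_k,) and attention weights (T,).
    """
    # score_t = v . tanh(W_q q + W_k h_t), one score per encoder step
    scores = np.tanh(query @ W_q + keys @ W_k) @ v   # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # softmax
    context = weights @ keys                          # (d_k,)
    return context, weights

# toy example with made-up dimensions
rng = np.random.default_rng(0)
d_q, d_k, d_a, T = 4, 4, 8, 5
W_q = rng.normal(size=(d_q, d_a))
W_k = rng.normal(size=(d_k, d_a))
v = rng.normal(size=(d_a,))
context, weights = bahdanau_attention(rng.normal(size=d_q),
                                      rng.normal(size=(T, d_k)),
                                      W_q, W_k, v)
```

In a GRU encoder-decoder, the context vector is concatenated with the decoder input (or state) at each step, so the decoder can attend to different source positions as it generates each target token.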
Bangla Conversational Chatbot using Bidirectional LSTM with Attention Mechanism
s-atmech is an independent, open-source deep learning Python library that implements attention mechanisms as RNN (recurrent neural network) layers in an encoder-decoder system (currently supports only Bahdanau attention).
A language translator based on a very simple NLP Transformer-style model, with an encoder, a decoder, and a Bahdanau attention layer in between, implemented in TensorFlow.
ExpressNet is a ready-to-use, weightless text classification model architecture that you can import and start training immediately.
Sequence-to-sequence models with attention mechanisms in TensorFlow v2.
Implemented an Encoder-Decoder model in TensorFlow, where ResNet-50 extracts features from the VizWiz-Captions image dataset and a GRU with Bahdanau attention generates captions.
Solution for the Quora Insincere Questions Classification Kaggle competition.
Natural Language Processing