[VLDB'22] Anomaly Detection using Transformers, self-conditioning and adversarial training.
PyTorch implementations of several attention mechanisms for deep learning researchers.
Deep Xi: A deep learning approach to a priori SNR estimation implemented in TensorFlow 2/Keras. For speech enhancement and robust ASR.
Exploring attention weights in transformer-based models with linguistic knowledge.
"Attention, Learn to Solve Routing Problems!"[Kool+, 2019], Capacitated Vehicle Routing Problem solver
This repository contains various types of attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras (a minimal sketch of additive attention appears after this list)
A faster PyTorch implementation of multi-head self-attention (see the multi-head attention sketch after this list)
Visualization for simple attention and Google's multi-head attention.
Multi^2OIE: Multilingual Open Information Extraction Based on Multi-Head Attention with BERT (Findings of ACL: EMNLP 2020)
Attention-based Induction Networks for Few-Shot Text Classification
Self-Supervised Vision Transformers for multiplexed imaging datasets
Several types of attention modules written in PyTorch
This is the official repository of the original Point Transformer architecture.
The original Transformer implemented from scratch, with informative comments on each block.
Benchmarks of the C++ interface of FlashAttention and FlashAttention-2 in large language model (LLM) inference scenarios.
Decoding Attention is specially optimized for multi-head attention (MHA) using CUDA cores in the decoding stage of LLM inference.
EMNLP 2018: Multi-Head Attention with Disagreement Regularization; NAACL 2019: Information Aggregation for Multi-Head Attention with Routing-by-Agreement
Sentence encoder and training code for Mean-Max AAE
Collection of different types of transformers for learning purposes
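For reference alongside the PyTorch-based repositories above, here is a minimal sketch of scaled dot-product multi-head attention. It is not taken from any listed project; the class and parameter names (MultiHeadAttention, d_model, num_heads) are illustrative only.

```python
# Minimal sketch of scaled dot-product multi-head attention in PyTorch.
# Names and shapes are illustrative assumptions, not code from any repo above.
import math
import torch
import torch.nn as nn


class MultiHeadAttention(nn.Module):
    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must divide evenly by num_heads"
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # One projection each for queries, keys, values, and the output.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, query, key, value, mask=None):
        # query/key/value: (batch, seq_len, d_model)
        batch, q_len, _ = query.shape
        k_len = key.shape[1]

        def split_heads(x, seq_len):
            # (batch, seq_len, d_model) -> (batch, num_heads, seq_len, d_head)
            return x.view(batch, seq_len, self.num_heads, self.d_head).transpose(1, 2)

        q = split_heads(self.q_proj(query), q_len)
        k = split_heads(self.k_proj(key), k_len)
        v = split_heads(self.v_proj(value), k_len)

        # Scaled dot-product attention, computed per head in parallel.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        context = weights @ v  # (batch, num_heads, q_len, d_head)

        # Merge heads back together and apply the output projection.
        context = context.transpose(1, 2).contiguous().view(batch, q_len, -1)
        return self.out_proj(context)


if __name__ == "__main__":
    mha = MultiHeadAttention(d_model=64, num_heads=8)
    x = torch.randn(2, 10, 64)  # (batch, seq_len, d_model)
    out = mha(x, x, x)          # self-attention: Q = K = V
    print(out.shape)            # torch.Size([2, 10, 64])
```

Splitting d_model into num_heads smaller subspaces keeps the total parameter count of a single-head layer while letting each head attend to different positions independently.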
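Several repositories above also cover additive (Bahdanau-style) attention. The sketch below shows one common formulation, score(s, h_i) = vᵀ tanh(W_q s + W_k h_i); all names (AdditiveAttention, query_dim, key_dim, hidden_dim) are hypothetical and not drawn from any listed repository.

```python
# Minimal sketch of additive (Bahdanau-style) attention in PyTorch.
# Illustrative only; not code from any repository listed above.
import torch
import torch.nn as nn


class AdditiveAttention(nn.Module):
    def __init__(self, query_dim: int, key_dim: int, hidden_dim: int):
        super().__init__()
        self.w_query = nn.Linear(query_dim, hidden_dim, bias=False)
        self.w_key = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys, values):
        # query: (batch, query_dim); keys/values: (batch, seq_len, key_dim)
        # Broadcast the query over the sequence and score each position.
        scores = self.v(torch.tanh(self.w_query(query).unsqueeze(1) + self.w_key(keys)))
        weights = torch.softmax(scores.squeeze(-1), dim=-1)           # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), values).squeeze(1)  # (batch, key_dim)
        return context, weights


if __name__ == "__main__":
    attn = AdditiveAttention(query_dim=32, key_dim=48, hidden_dim=64)
    q = torch.randn(4, 32)      # e.g. a decoder state
    k = torch.randn(4, 7, 48)   # e.g. encoder outputs
    ctx, w = attn(q, k, k)
    print(ctx.shape, w.shape)   # torch.Size([4, 48]) torch.Size([4, 7])
```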