- [arXiv] Non-Autoregressive Translation with Layer-Wise Prediction and Deep Supervision
- [arXiv] MvSR-NAT: Multi-view Subset Regularization for Non-Autoregressive Machine Translation
- [CL] Sequence-Level Training for Non-Autoregressive Neural Machine Translation
- [EMNLP] Exploring Non-Autoregressive Text Style Transfer
- [EMNLP] Learning to Rewrite for Non-Autoregressive Neural Machine Translation
- [EMNLP] AligNART: Non-autoregressive Neural Machine Translation by Jointly Learning to Estimate Alignment and Translate
- [ICML] Order-Agnostic Cross Entropy for Non-Autoregressive Machine Translation
- [ICML] BANG: Bridging Autoregressive and Non-autoregressive Generation with Large Scale Pretraining
A pretrained model can simultaneously support AR, NAR, and semi-NAR generation; question generation, summarization, dialogue generation
- [ACL] Rejuvenating Low-Frequency Words: Making the Most of Parallel Data in Non-Autoregressive Translation
- [ACL] Progressive Multi-Granularity Training for Non-Autoregressive Translation
- [ACL] GLAT: Glancing Transformer for Non-Autoregressive Neural Machine Translation
- [ACL] POS-Constrained Parallel Decoding for Non-autoregressive Generation
- [ACL Findings] Fully Non-autoregressive Neural Machine Translation: Tricks of the Trade
- [ACL SRW] Using Perturbed Length-aware Positional Encoding for Non-autoregressive Neural Machine Translation
- [EACL] Enriching Non-Autoregressive Transformer with Syntactic and Semantic Structures for Neural Machine Translation
- [EACL] Non-Autoregressive Text Generation with Pre-trained Language Models
- [NAACL] Non-Autoregressive Semantic Parsing for Compositional Task-Oriented Dialog
- [NAACL] Non-Autoregressive Translation by Learning Target Categorical Codes
- [NAACL] Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation
- [ICLR] Understanding and Improving Lexical Choice in Non-Autoregressive Translation
alleviates lexical choice errors on low-frequency words that are propagated to the NAT model from the teacher model; Machine Translation
- [AAAI] Guiding Non-Autoregressive Neural Machine Translation Decoding with Reordering Information
- [ICASSP] Parallel Tacotron: Non-Autoregressive and Controllable TTS
a non-autoregressive neural text-to-speech model augmented with a variational autoencoder based residual encoder
- [arXiv] Listen and Fill in the Missing Letters: Non-Autoregressive Transformer for Speech Recognition
two different non-autoregressive Transformer structures for automatic speech recognition (ASR): A-CMLM and A-FMLM
- [arXiv] Non-Autoregressive Neural Dialogue Generation
- [arXiv] Improving Fluency of Non-Autoregressive Machine Translation
- [arXiv] Semi-Autoregressive Training Improves Mask-Predict Decoding
- [arXiv] LAVA NAT: A Non-Autoregressive Translation Model with Look-Around Decoding and Vocabulary Attention
- [IJCAI] Task-Level Curriculum Learning for Non-Autoregressive Neural Machine Translation
- [COLING] Context-Aware Cross-Attention for Non-Autoregressive Translation
- [COLING] Infusing Sequential Information into Conditional Masked Translation Model with Self-Review Mechanism
- [NeurIPS] Incorporating BERT into Parallel Sequence Decoding with Adapters
- [EMNLP] Non-Autoregressive Machine Translation with Latent Alignments
- [EMNLP] Iterative Refinement in the Continuous Space for Non-Autoregressive Neural Machine Translation
- [EMNLP] SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling
- [INTERSPEECH] Mask CTC: Non-Autoregressive End-to-End ASR with CTC and Mask Predict
- [INTERSPEECH] Insertion-Based Modeling for End-to-End Automatic Speech Recognition
- [ACL] Learning to Recover from Multi-Modality Errors for Non-Autoregressive Neural Machine Translation
- [ACL] Jointly Masked Sequence-to-Sequence Model for Non-Autoregressive Neural Machine Translation
- [ACL] ENGINE: Energy-Based Inference Networks for Non-Autoregressive Machine Translation
- [ACL] Improving Non-autoregressive Neural Machine Translation with Monolingual Data
- [ACL] A Study of Non-autoregressive Model for Sequence Generation
a study to understand the difficulty of NAR sequence generation
- [ICML] Non-Autoregressive Neural Text-to-Speech
a non-autoregressive seq2seq model that converts text to speech
- [ICML] Aligned Cross Entropy for Non-Autoregressive Machine Translation
proposes aligned cross entropy (AXE) as an alternative loss function for training non-autoregressive models; Machine Translation
- [ICML] Non-autoregressive Machine Translation with Disentangled Context Transformer
- [ICML] Imputer: Sequence Modelling via Imputation and Dynamic Programming
- [ICML] An EM Approach to Non-autoregressive Conditional Sequence Generation
- [ICLR] Understanding Knowledge Distillation in Non-autoregressive Machine Translation
- [AAAI] Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation
- [AAAI] Latent-Variable Non-Autoregressive Neural Machine Translation with Deterministic Inference Using a Delta Posterior
a latent-variable non-autoregressive model with continuous latent variables and a deterministic inference procedure using a delta posterior; Machine Translation
- [AAAI] Fine-Tuning by Curriculum Learning for Non-Autoregressive Neural Machine Translation
designs a curriculum for the fine-tuning process to progressively switch training from autoregressive to non-autoregressive generation; Machine Translation
- [PMLR] Insertion Transformer: Flexible Sequence Generation via Insertion Operations
an iterative, partially autoregressive model for sequence generation based on insertion operations; machine translation
- [arXiv] Non-autoregressive Transformer by Position Learning
a non-autoregressive model which incorporates positions as a latent variable into the text generative process; machine translation, paraphrase generation
- [NeurIPS] Levenshtein Transformer
a non-autoregressive model whose basic operations are insertion and deletion; machine translation, text summarization, automatic post-editing
- [NeurIPS] Fast Structured Decoding for Sequence Models
designs an efficient approximation of Conditional Random Fields (CRF) for non-autoregressive sequence models; Machine Translation
- [NeurIPS] FastSpeech: Fast, Robust and Controllable Text to Speech
a novel feed-forward network based on the Transformer to generate mel-spectrograms in parallel; text to speech
- [EMNLP] Mask-Predict: Parallel Decoding of Conditional Masked Language Models
first predicts all target words non-autoregressively, then repeatedly masks out and regenerates the least-confident subset of words; Machine Translation
- [EMNLP] FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow
non-autoregressive sequence generation using latent variable models (generative flow); Machine Translation
- [EMNLP] Hint-Based Training for Non-Autoregressive Machine Translation
a novel approach that leverages hints from hidden states and word alignments to help the training of NART models; Machine Translation
- [ACL] Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation
proposes a sequence-level training method based on a novel reinforcement algorithm that reduces variance and stabilizes training, plus a Transformer decoder named FS-decoder that fuses target sequential information into the top layer of the decoder; Machine Translation
- [ACL] Imitation Learning for Non-Autoregressive Neural Machine Translation
an imitation learning framework for non-autoregressive machine translation
- [AAAI] Non-Autoregressive Machine Translation with Auxiliary Regularization
addresses repeated and incomplete translations by improving the quality of decoder hidden representations via two auxiliary regularization terms during NAT model training; Machine Translation
- [AAAI] Non-Autoregressive Neural Machine Translation with Enhanced Decoder Input
two methods to enhance the decoder inputs so as to improve NAT models; Machine Translation
- [ICML] Fast Decoding in Sequence Models Using Discrete Latent Variables
first autoencodes the target sequence into a shorter sequence of discrete latent variables autoregressively, then decodes the output sequence from this shorter latent sequence in parallel; Machine Translation
- [EMNLP] Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement
a conditional non-autoregressive neural sequence model based on iterative refinement; machine translation, image caption generation
- [EMNLP] End-to-End Non-Autoregressive Neural Machine Translation with Connectionist Temporal Classification
a novel non-autoregressive architecture based on connectionist temporal classification that can be trained end-to-end; Machine Translation
- [ICLR] Non-Autoregressive Neural Machine Translation
Machine Translation
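Several entries above (Mask-Predict, the conditional-masked-LM line of work, and their semi-autoregressive variants) share the same iterative mask-and-regenerate decoding loop. A minimal runnable sketch of that loop is below; `predict_fn` and `dummy_predictor` are illustrative stand-ins for a conditional masked LM, not code from any of the listed papers.

```python
import random

def mask_predict_decode(predict_fn, length, iterations=4, mask_token="<mask>"):
    """Toy sketch of Mask-Predict style decoding.

    `predict_fn(tokens)` stands in for a conditional masked LM: it returns
    (new_tokens, confidences) for the full sequence in one parallel pass.
    """
    tokens = [mask_token] * length           # start fully masked
    tokens, conf = predict_fn(tokens)        # initial parallel prediction
    for t in range(1, iterations):
        # linearly decaying mask budget, as in the Mask-Predict schedule
        n_mask = int(length * (iterations - t) / iterations)
        if n_mask == 0:
            break
        # re-mask the n least-confident positions
        worst = sorted(range(length), key=lambda i: conf[i])[:n_mask]
        for i in worst:
            tokens[i] = mask_token
        new_tokens, new_conf = predict_fn(tokens)
        for i in worst:                      # regenerate only the masked slots
            tokens[i] = new_tokens[i]
            conf[i] = new_conf[i]
    return tokens

# Dummy predictor: fills each masked slot from its position, with random confidence.
def dummy_predictor(tokens):
    out, conf = [], []
    for i, tok in enumerate(tokens):
        if tok == "<mask>":
            out.append(chr(ord("a") + i))
            conf.append(random.random())
        else:
            out.append(tok)
            conf.append(1.0)
    return out, conf

print(mask_predict_decode(dummy_predictor, 6))  # → ['a', 'b', 'c', 'd', 'e', 'f']
```

With a real model, `conf` would be the token probabilities of the masked LM, so each round keeps the tokens the model is sure about and retries the rest.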
Changhan Wang (wangchanghan@gmail.com)
Newly added
- [AAAI] Flexible Non-Autoregressive Extractive Summarization with Threshold: How to Extract a Non-Fixed Number of Summary Sentences.
- [ACL Findings] A Non-Autoregressive Edit-Based Approach to Controllable Text Simplification
- [ACL Findings] Investigating the Reordering Capability in CTC-based Non-Autoregressive End-to-End Speech Translation
- [ACL Findings] NAST: A Non-Autoregressive Generator with Word Alignment for Unsupervised Text Style Transfer.
- [ACL] Tail-to-Tail Non-Autoregressive Sequence Prediction for Chinese Grammatical Error Correction.
- [ACL] How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation?
- [EMNLP] Maximal Clique Based Non-Autoregressive Open Information Extraction
- [EMNLP Findings] Span Pointer Networks for Non-Autoregressive Task-Oriented Semantic Parsing
- [EMNLP] Thinking Clearly, Talking Fast: Concept-Guided Non-Autoregressive Generation for Open-Domain Dialogue Systems
- [ICASSP] CASS-NAT: CTC Alignment-Based Single Step Non-Autoregressive Transformer for Speech Recognition
- [ICASSP] Non-Autoregressive Sequence-To-Sequence Voice Conversion
- [ICASSP] Improved Mask-CTC for Non-Autoregressive End-to-End ASR
- [ICASSP] ORTHROS: Non-Autoregressive End-to-End Speech Translation with Dual-Decoder
- [ICASSP] Non-Autoregressive Transformer ASR with CTC-Enhanced Decoder Input
- [ICLR] Deep Encoder, Shallow Decoder: Reevaluating Non-autoregressive Machine Translation
- [ICLR] Bidirectional Variational Inference for Non-Autoregressive Text-to-Speech
- [NAACL] Align-Refine: Non-Autoregressive Speech Recognition via Iterative Realignment
- [WMT@EMNLP] The Volctrans GLAT System: Non-autoregressive Translation Meets WMT21
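Many entries in both lists (CTC-based NAT, Mask CTC, CASS-NAT, Imputer, ORTHROS) build on connectionist temporal classification. The part of CTC these models all rely on at inference time is the greedy collapse rule: merge consecutive repeated labels, then drop blanks. A toy sketch, with an illustrative function name:

```python
def ctc_collapse(frames, blank="-"):
    """Greedy CTC collapse: merge adjacent repeats, then drop blank symbols.

    `frames` is a per-position label sequence (e.g. an argmax over a CTC
    output distribution at each frame); this is a toy illustration, not
    code from any of the listed papers.
    """
    out = []
    prev = None
    for f in frames:
        if f != prev and f != blank:  # keep first of each run, skip blanks
            out.append(f)
        prev = f
    return "".join(out)

print(ctc_collapse(list("hh-e--ll-lo-")))  # → hello
```

The blank symbol is what lets CTC emit genuinely repeated characters ("ll" above survives because a blank separates the two runs), which is why the alignment-free training of these NAR models still recovers correct outputs.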