Transformer-based models implemented in TensorFlow 2.x (using Keras).
[NeurIPS 2023 Main Track] This is the repository for the paper titled "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner"
The source code for the paper "Empower Entity Set Expansion via Language Model Probing", published at ACL 2020.
Recent Advances in Vision-Language Pre-training!
Code to reproduce experiments from the paper "Continual Pre-Training Mitigates Forgetting in Language and Vision" https://arxiv.org/abs/2205.09357
Sample tutorials for training Natural Language Processing Models with Transformers
Comparing Selective Masking Methods for Depression Detection in Social Media
😷 The Fill-Mask Association Test (FMAT): Measuring Propositions in Natural Language.
[CHIL 2024] Interpretation of Intracardiac Electrograms Through Textual Representations
A transformer-based language model trained on politics-related Twitter data. This repo is the official resource of the paper "PoliBERTweet: A Pre-trained Language Model for Analyzing Political Content on Twitter", LREC 2022
Grammar test suite for masked language models
A Context-Aware Approach for Generating Natural Language Attacks.
Transformers Intuition
Predict the whole sequence and 3D structure of masked protein sequences with ESM by @evolutionaryscale
[EMNLP 2024] Official Implementation of DisGeM: Distractor Generation for Multiple Choice Questions with Span Masking
Score masked language models on grammar test suites
Final assignment for the "Gestione dell'Informazione" ("Search Engines") course @ UniMoRe
Transformer pre-training with the MLM objective: an encoder-only model implemented and trained from scratch on a Wikipedia dataset.
Training the first Cypriot Large Language Model with the masked language modeling objective of predicting a masked word token within a given context
Measuring Biases in Masked Language Models for PyTorch Transformers. Support for multiple social biases and evaluation measures.
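Several of the repositories above (the Fill-Mask Association Test, the grammar test suites, and the bias measurement toolkit) probe a masked language model by querying its predictions at a masked position. The snippet below is a minimal sketch of that pattern using the Hugging Face transformers fill-mask pipeline; bert-base-uncased is an assumed stand-in model, not the checkpoint used by any particular repository listed here.

```python
from transformers import pipeline

# Load a generic masked language model; bert-base-uncased is an assumed
# stand-in, not the model used by any specific repository above.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Ask the model to fill the [MASK] slot and inspect the top predictions
# with their scores -- the basic operation behind fill-mask probing.
for prediction in fill_mask("The doctor said [MASK] would call back later."):
    print(f"{prediction['token_str']:>10}  {prediction['score']:.4f}")
```

Test suites and bias metrics built on this pattern typically compare the scores assigned to contrasting candidate fillers (for example, "he" versus "she") rather than only inspecting the single top prediction.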