Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Updated Sep 29, 2024 - Rust
Pre-trained Chinese ELECTRA (Chinese ELECTRA pre-trained models)
Pre-trained Transformers for Arabic Language Understanding and Generation (Arabic BERT, Arabic GPT2, Arabic ELECTRA)
Pretrained ELECTRA Model for Korean
Implementations of common NLP tasks, including new-word discovery, plus PyTorch-based word vectors, Chinese text classification, named entity recognition, abstractive summarization, sentence-similarity judgment, triple extraction, pre-trained models, and more.
Build and train state-of-the-art natural language processing models using BERT
Pytorch-Named-Entity-Recognition-with-transformers
"Intel Innovation Masters Cup" Deep Learning Challenge, Track 2: CCKS2021 Chinese NLP address element parsing
Turkish BERT/DistilBERT, ELECTRA and ConvBERT models
Pretrain and finetune ELECTRA with fastai and huggingface. (Results of the paper replicated!)
🤗 Korean Comments ELECTRA: an ELECTRA model trained on Korean comments
AI and Memory Wall
PyTorch Transformer implementation, language models (BERT MLM, ELECTRA), and machine translation tests.
Baseline code for Korean open-domain question answering (ODQA)
DBMDZ BERT, DistilBERT, ELECTRA, GPT-2 and ConvBERT models
Electra pre-trained model using Vietnamese corpus
Pre-trained Chinese ELECTRA model: pretraining a Chinese model with adversarial learning
Pre-trained model inference and training on TensorFlow 1.x, with support for single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision. Flexible training, validation, and prediction.
Turkish-Reading-Comprehension-Question-Answering-Dataset
This little library generates raw IR commands for your Electra air conditioner (Raspberry Pi)