Infinity is a high-throughput, low-latency serving engine for text embeddings, reranking models, CLIP, CLAP, and ColPali.
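Infinity exposes an OpenAI-compatible REST API, so a running deployment can be queried with plain HTTP. Below is a minimal sketch in Python; the port and model id are assumptions and must match your own deployment.

```python
# Minimal sketch: query a locally running Infinity server through its
# OpenAI-compatible /embeddings endpoint.
# ASSUMPTIONS: port 7997 (Infinity's default) and the model id below
# must match whatever your server was started with.
import requests

response = requests.post(
    "http://localhost:7997/embeddings",
    json={
        "model": "BAAI/bge-small-en-v1.5",  # hypothetical model choice
        "input": ["Infinity serves embeddings with high throughput."],
    },
)
response.raise_for_status()
embedding = response.json()["data"][0]["embedding"]
print(len(embedding))  # dimensionality of the returned vector
```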
PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
💭 Aspect-Based-Sentiment-Analysis: Transformer & Explainable ML (TensorFlow)
NLP for humans: a fast, easy-to-use natural language processing (NLP) toolkit.
Fast, non-autoregressive grammatical error correction using BERT. Code and pre-trained models for the paper "Parallel Iterative Edit Models for Local Sequence Transduction" (EMNLP-IJCNLP 2019): www.aclweb.org/anthology/D19-1435.pdf
Simple State-of-the-Art BERT-Based Sentence Classification with Keras / TensorFlow 2. Built with HuggingFace's Transformers.
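As a rough illustration of this pattern (a generic sketch with Hugging Face Transformers and Keras, not this repository's own API), a BERT sentence classifier can be fine-tuned in a few lines; the checkpoint, toy data, and hyperparameters are illustrative assumptions.

```python
# Sketch: fine-tune BERT for binary sentence classification with Keras.
# The checkpoint, toy data, and learning rate are illustrative only.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TFAutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

texts = ["great movie", "terrible plot"]
labels = tf.constant([1, 0])
encodings = tokenizer(texts, padding=True, truncation=True, return_tensors="tf")

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(dict(encodings), labels, epochs=1, batch_size=2)
```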
Pre-trained BERT model for cybersecurity text, capturing cybersecurity domain knowledge.
This repository contains PyTorch implementation for the baseline models from the paper Utterance-level Dialogue Understanding: An Empirical Study
A repository of the basic machine learning concepts I am studying. More to come...
Text similarity, semantic vectors, text vectors: text-similarity, similarity, sentence-similarity, BERT, SimCSE, BERT-Whitening, Sentence-BERT, PromCSE, SBERT
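The core idea behind most of these methods is to encode each sentence into a dense vector and compare vectors by cosine similarity. A minimal sketch using the sentence-transformers (SBERT) library; the model name is an illustrative assumption:

```python
# Sketch: sentence similarity via embeddings + cosine similarity.
# The checkpoint "all-MiniLM-L6-v2" is an illustrative choice.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(
    ["How old are you?", "What is your age?"], convert_to_tensor=True
)
score = util.cos_sim(embeddings[0], embeddings[1])
print(float(score))  # cosine similarity in [-1, 1]; near 1 = paraphrase
```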
ColBERT humor dataset for the task of humor detection, containing 200,000 jokes and news headlines
ELECTRA model pre-trained on a Vietnamese corpus
Implementation of the ICLR 2021 paper: Probing BERT in Hyperbolic Spaces
Bilingual term extractor
Automated Essay Scoring using BERT
This is the code for loading the SenseBERT model, described in our paper from ACL 2020.
Hierarchical-Attention-Network
Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT-based transformers from Hugging Face.
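A minimal sketch of that general approach (not this library's own API): embed documents with a BERT-style encoder, L2-normalize so that Euclidean distance tracks cosine similarity, then cluster. The model name and cluster count are assumptions.

```python
# Sketch: topic clustering over transformer embeddings.
# Checkpoint and n_clusters are illustrative assumptions.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans
from sklearn.preprocessing import normalize

docs = [
    "stocks fell sharply on Monday",
    "markets rallied after the rate cut",
    "the team won the final 3-0",
    "a late goal sealed the match",
]
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = normalize(model.encode(docs))  # unit vectors -> cosine geometry

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)
print(labels)  # e.g. [0 0 1 1]: finance vs. sport
```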
Fast and memory-efficient library for WordPiece tokenization as it is used by BERT.
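For reference, WordPiece splits rare words into subword pieces, marking continuations with a "##" prefix. A quick sketch via a standard Hugging Face tokenizer (not this library itself):

```python
# Sketch: WordPiece tokenization as BERT applies it.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
print(tokenizer.tokenize("tokenization"))  # e.g. ['token', '##ization']
```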
An easy-to-use Python module that helps you to extract the BERT embeddings for a large text dataset (Bengali/English) efficiently.
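A generic recipe for that task (a sketch, not this module's API): batch the texts, run them through BERT without gradients, and mean-pool the token states over the attention mask. The checkpoint and batch size are illustrative assumptions.

```python
# Sketch: batched BERT embedding extraction with mean pooling.
# Checkpoint and batch size are illustrative; no_grad + small batches
# keep peak memory bounded on large datasets.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").eval()

texts = ["first document", "second document", "third document"]
batch_size = 2
vectors = []
with torch.no_grad():
    for start in range(0, len(texts), batch_size):
        batch = tokenizer(texts[start:start + batch_size], padding=True,
                          truncation=True, return_tensors="pt")
        hidden = model(**batch).last_hidden_state        # (B, T, 768)
        mask = batch["attention_mask"].unsqueeze(-1)     # zero out padding
        vectors.append((hidden * mask).sum(1) / mask.sum(1))

embeddings = torch.cat(vectors)
print(embeddings.shape)  # torch.Size([3, 768])
```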