OlegPlatonov/roberta-training

This code can be used to train RoBERTa on various tasks. It is designed to be easily extensible, so new tasks can be added.

The currently supported tasks are:

  • masked language modeling (a minimal training sketch is shown after this list);
  • gapped text (a custom pretraining task);
  • extractive question answering.
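
For illustration, the snippet below sketches a single masked language modeling training step for RoBERTa using the Hugging Face transformers library. It is a generic example, not this repository's actual training script: the model name, masking probability, batch contents, and optimizer settings are assumptions chosen for the sketch.

```python
# Minimal masked language modeling sketch for RoBERTa with Hugging Face
# transformers. This is an illustrative example, NOT this repository's
# training entry point; "roberta-base", mlm_probability, and the learning
# rate are assumed values.
import torch
from transformers import (
    RobertaTokenizer,
    RobertaForMaskedLM,
    DataCollatorForLanguageModeling,
)

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")

# The collator randomly replaces tokens with <mask> and builds the labels
# used to compute the masked language modeling loss.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

texts = [
    "RoBERTa is pretrained with a masked language modeling objective.",
    "New tasks can be added by defining their data processing and loss.",
]
encodings = [tokenizer(t, truncation=True, max_length=128) for t in texts]
batch = collator(encodings)  # pads the batch and applies dynamic masking

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

model.train()
loss = model(**batch).loss  # cross-entropy over the masked positions only
loss.backward()
optimizer.step()
optimizer.zero_grad()

print(f"masked LM loss: {loss.item():.4f}")
```

The other tasks (gapped text, extractive question answering) follow the same overall pattern of task-specific data processing plus a task-specific loss on top of the RoBERTa encoder.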
