This code trains RoBERTa on various tasks. It is designed to be easily extensible, so new tasks can be added. The currently supported tasks are:
- masked language modeling;
- gapped text (a custom pretraining task);
- extractive question answering.
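One common way to make a training codebase extensible is a task registry, where each task registers itself under a name and new tasks plug in without touching the trainer. The sketch below is illustrative only, not this repo's actual API; the registry, decorator, and class names are all hypothetical.

```python
# Hypothetical sketch of an extensible task registry; none of these
# names come from the repo itself.
TASK_REGISTRY: dict = {}

def register_task(name: str):
    """Register a task class under `name` so it can be looked up later."""
    def wrap(cls):
        TASK_REGISTRY[name] = cls
        return cls
    return wrap

@register_task("masked_lm")
class MaskedLMTask:
    """Masked language modeling."""

@register_task("gapped_text")
class GappedTextTask:
    """Gapped text, the custom pretraining task."""

@register_task("extractive_qa")
class ExtractiveQATask:
    """Extractive question answering."""

def get_task(name: str):
    """Instantiate the task registered under `name`."""
    return TASK_REGISTRY[name]()
```

Under this pattern, supporting a new task is just defining a class and decorating it; the rest of the training loop stays unchanged.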