Arabic edition of BERT pretrained language models
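Below is a minimal sketch of loading an Arabic BERT checkpoint with the Hugging Face transformers library. The model id asafaya/bert-base-arabic is an assumption used for illustration; substitute whichever Arabic BERT checkpoint this project actually provides.

```python
# Minimal sketch: load an Arabic BERT checkpoint and run a masked-LM forward pass.
# The model id below is an illustrative assumption, not necessarily this project's release.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "asafaya/bert-base-arabic"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Encode an Arabic sentence ("The Arabic language is beautiful") and inspect the logits.
inputs = tokenizer("اللغة العربية جميلة", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, vocab_size)
```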
We use the BERT language model for Twitter sentiment analysis in the run-up to the 2020 US presidential election, and investigate whether sentiment analysis can give an indication of the election outcome, comparing a canonical LSTM with the BERT language model.
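As a rough illustration of the BERT half of that comparison, the sketch below scores tweet sentiment with a pretrained sequence-classification pipeline from Hugging Face transformers. The checkpoint name and example tweets are assumptions for illustration, not the project's actual data or fine-tuned model.

```python
# Minimal sketch: score tweet sentiment with a BERT-family classification head.
# The checkpoint and tweets below are illustrative assumptions.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="nlptown/bert-base-multilingual-uncased-sentiment",  # assumed checkpoint
)

tweets = [
    "Great debate performance tonight!",
    "The economy is in terrible shape.",
]

# The pipeline returns one label/score pair per tweet.
for tweet, result in zip(tweets, classifier(tweets)):
    print(f"{result['label']:>8}  ({result['score']:.2f})  {tweet}")
```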