Noise144 committed
Commit 48fa847 · verified · 1 Parent(s): db21f60

Update README.md

Files changed (1):
  1. README.md +0 -3
README.md CHANGED
@@ -55,9 +55,6 @@ Fine-tuning is performed according to the following parameters:
 
 ## Model description
 
-XLM-RoBERTa model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.
-It was introduced in the paper Unsupervised Cross-lingual Representation Learning at Scale by Conneau et al. and first released in this repository.
-
 XLM-RoBERTa is a multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.
 RoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion.
 This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts.
 
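The pretraining objective the README describes (predicting tokens masked out of raw text, with inputs and labels generated automatically rather than by human annotators) can be exercised directly through a fill-mask pipeline. Below is a minimal sketch using the Hugging Face transformers library; the public `xlm-roberta-base` checkpoint is an assumption here, since the fine-tuned model this commit documents may be hosted under a different name.

```python
from transformers import pipeline

# fill-mask reproduces the self-supervised pretraining task: the model
# predicts tokens that were masked out of raw text, so inputs and labels
# come from the text itself with no human labelling.
# NOTE: "xlm-roberta-base" is the public base checkpoint, assumed here;
# swap in this repository's model id to query the fine-tuned variant.
fill_mask = pipeline("fill-mask", model="xlm-roberta-base")

# XLM-RoBERTa uses "<mask>" as its mask token.
for prediction in fill_mask("The capital of France is <mask>."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")
```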