RoBERTa Model Summary
The table below summarizes the pretrained weights of the RoBERTa models currently supported by PaddleNLP. For details of each model, please refer to the corresponding link.
| Pretrained Weight | Language | Details of the model |
|---|---|---|
| roberta-wwm-ext | Chinese | 12-layer, 768-hidden, 12-heads, 102M parameters. Trained on Chinese text using Whole-Word-Masking with extended data. |
| roberta-wwm-ext-large | Chinese | 24-layer, 1024-hidden, 16-heads, 325M parameters. Trained on Chinese text using Whole-Word-Masking with extended data. |
| rbt3 | Chinese | 3-layer, 768-hidden, 12-heads, 38M parameters. |
| rbtl3 | Chinese | 3-layer, 1024-hidden, 16-heads, 61M parameters. |
| rbt4 | Chinese | 4-layer, 768-hidden, 12-heads, 47M parameters. |
| rbt6 | Chinese | 6-layer, 768-hidden, 12-heads, 60M parameters. |
| deepset/roberta-base-squad2 | English | 12-layer, 768-hidden, 12-heads, 124M parameters. Trained on English text. Please refer to: deepset/roberta-base-squad2 |
| uer/roberta-base-chinese-extractive-qa | Chinese | 12-layer, 768-hidden, 12-heads, 101M parameters. Trained on Chinese text. Please refer to: uer/roberta-base-chinese-extractive-qa |
| uer/roberta-base-finetuned-chinanews-chinese | Chinese | 12-layer, 768-hidden, 12-heads, 102M parameters. Trained on Chinese text. Please refer to: uer/roberta-base-finetuned-chinanews-chinese |
| uer/roberta-base-finetuned-cluener2020-chinese | Chinese | 12-layer, 768-hidden, 12-heads, 101M parameters. Trained on Chinese text. Please refer to: uer/roberta-base-finetuned-cluener2020-chinese |
| roberta-base | English | Please refer to: roberta-base |
| cardiffnlp/twitter-roberta-base-sentiment | English | Please refer to: cardiffnlp/twitter-roberta-base-sentiment |
| roberta-large | English | Please refer to: roberta-large |
| distilroberta-base | English | Please refer to: distilroberta-base |
| cross-encoder/nli-distilroberta-base | English | Please refer to: cross-encoder/nli-distilroberta-base |
| siebert/sentiment-roberta-large-english | English | Please refer to: siebert/sentiment-roberta-large-english |
| j-hartmann/emotion-english-distilroberta-base | English | Please refer to: j-hartmann/emotion-english-distilroberta-base |
| roberta-base-openai-detector | English | Please refer to: roberta-base-openai-detector |
| huggingface/CodeBERTa-small-v1 | English | Please refer to: huggingface/CodeBERTa-small-v1 |
| mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis | English | Please refer to: mrm8488/distilroberta-finetuned-financial-news-sentiment-analysis |
| cardiffnlp/twitter-roberta-base-emotion | English | Please refer to: cardiffnlp/twitter-roberta-base-emotion |
| seyonec/PubChem10M_SMILES_BPE_396_250 | English | Please refer to: seyonec/PubChem10M_SMILES_BPE_396_250 |
| textattack/roberta-base-SST-2 | English | Please refer to: textattack/roberta-base-SST-2 |
| sshleifer/tiny-distilroberta-base | English | Please refer to: sshleifer/tiny-distilroberta-base |
| thatdramebaazguy/roberta-base-squad | English | Please refer to: thatdramebaazguy/roberta-base-squad |
| ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli | English | Please refer to: ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli |
| ufal/robeczech-base | Czech | Please refer to: ufal/robeczech-base |
| seyonec/PubChem10M_SMILES_BPE_450k | English | Please refer to: seyonec/PubChem10M_SMILES_BPE_450k |
| cardiffnlp/twitter-roberta-base | English | Please refer to: cardiffnlp/twitter-roberta-base |
| seyonec/PubChem10M_SMILES_BPE_50k | English | Please refer to: seyonec/PubChem10M_SMILES_BPE_50k |
| microsoft/codebert-base-mlm | English | Please refer to: microsoft/codebert-base-mlm |
| textattack/roberta-base-MNLI | English | Please refer to: textattack/roberta-base-MNLI |
| cardiffnlp/twitter-roberta-base-offensive | English | Please refer to: cardiffnlp/twitter-roberta-base-offensive |
| cross-encoder/stsb-roberta-large | English | Please refer to: cross-encoder/stsb-roberta-large |
| seyonec/ChemBERTa_zinc250k_v2_40k | English | Please refer to: seyonec/ChemBERTa_zinc250k_v2_40k |
| uklfr/gottbert-base | German | Please refer to: uklfr/gottbert-base |
| seyonec/ChemBERTa-zinc-base-v1 | English | Please refer to: seyonec/ChemBERTa-zinc-base-v1 |
| roberta-large-openai-detector | English | Please refer to: roberta-large-openai-detector |
| cross-encoder/quora-roberta-base | English | Please refer to: cross-encoder/quora-roberta-base |
| cross-encoder/stsb-roberta-base | English | Please refer to: cross-encoder/stsb-roberta-base |
| microsoft/graphcodebert-base | English | Please refer to: microsoft/graphcodebert-base |
| cardiffnlp/twitter-roberta-base-hate | English | Please refer to: cardiffnlp/twitter-roberta-base-hate |
| chkla/roberta-argument | English | Please refer to: chkla/roberta-argument |
| Salesforce/grappa_large_jnt | English | Please refer to: Salesforce/grappa_large_jnt |
| vinai/bertweet-large | English | Please refer to: vinai/bertweet-large |
| allenai/biomed_roberta_base | English | Please refer to: allenai/biomed_roberta_base |
| facebook/muppet-roberta-base | English | Please refer to: facebook/muppet-roberta-base |
| Rakib/roberta-base-on-cuad | English | Please refer to: Rakib/roberta-base-on-cuad |
| cross-encoder/stsb-distilroberta-base | English | Please refer to: cross-encoder/stsb-distilroberta-base |
| nyu-mll/roberta-base-1B-1 | English | Please refer to: nyu-mll/roberta-base-1B-1 |
| nyu-mll/roberta-med-small-1M-1 | English | Please refer to: nyu-mll/roberta-med-small-1M-1 |
| SkolkovoInstitute/roberta_toxicity_classifier | English | Please refer to: SkolkovoInstitute/roberta_toxicity_classifier |
| facebook/muppet-roberta-large | English | Please refer to: facebook/muppet-roberta-large |
| lassl/roberta-ko-small | Korean | Please refer to: lassl/roberta-ko-small |
| huggingface/CodeBERTa-language-id | English | Please refer to: huggingface/CodeBERTa-language-id |
| textattack/roberta-base-imdb | English | Please refer to: textattack/roberta-base-imdb |
| macedonizer/mk-roberta-base | Macedonian | Please refer to: macedonizer/mk-roberta-base |
| cross-encoder/nli-MiniLM2-L6-H768 | English | Please refer to: cross-encoder/nli-MiniLM2-L6-H768 |
| textattack/roberta-base-QNLI | English | Please refer to: textattack/roberta-base-QNLI |
| deepset/roberta-base-squad2-covid | English | Please refer to: deepset/roberta-base-squad2-covid |
| textattack/roberta-base-MRPC | English | Please refer to: textattack/roberta-base-MRPC |
| bhadresh-savani/roberta-base-emotion | English | Please refer to: bhadresh-savani/roberta-base-emotion |
| aychang/roberta-base-imdb | English | Please refer to: aychang/roberta-base-imdb |
| cross-encoder/quora-distilroberta-base | English | Please refer to: cross-encoder/quora-distilroberta-base |
| csarron/roberta-base-squad-v1 | English | Please refer to: csarron/roberta-base-squad-v1 |
| seyonec/ChemBERTA_PubChem1M_shard00_155k | English | Please refer to: seyonec/ChemBERTA_PubChem1M_shard00_155k |
| mental/mental-roberta-base | English | Please refer to: mental/mental-roberta-base |
| textattack/roberta-base-CoLA | English | Please refer to: textattack/roberta-base-CoLA |
| navteca/quora-roberta-base | English | Please refer to: navteca/quora-roberta-base |
| cardiffnlp/twitter-roberta-base-emoji | English | Please refer to: cardiffnlp/twitter-roberta-base-emoji |
| benjamin/roberta-base-wechsel-german | Multilingual | Please refer to: benjamin/roberta-base-wechsel-german |
| textattack/roberta-base-ag-news | English | Please refer to: textattack/roberta-base-ag-news |
| johngiorgi/declutr-base | English | Please refer to: johngiorgi/declutr-base |
| salesken/query_wellformedness_score | English | Please refer to: salesken/query_wellformedness_score |
| blinoff/roberta-base-russian-v0 | Russian | Please refer to: blinoff/roberta-base-russian-v0 |
| allenai/reviews_roberta_base | English | Please refer to: allenai/reviews_roberta_base |
| ruiqi-zhong/roberta-base-meta-tuning-test | English | Please refer to: ruiqi-zhong/roberta-base-meta-tuning-test |
| mrm8488/distilroberta-finetuned-tweets-hate-speech | English | Please refer to: mrm8488/distilroberta-finetuned-tweets-hate-speech |
| cointegrated/roberta-large-cola-krishna2020 | English | Please refer to: cointegrated/roberta-large-cola-krishna2020 |
| deepset/roberta-base-squad2-distilled | English | Please refer to: deepset/roberta-base-squad2-distilled |
| tli8hf/unqover-roberta-base-squad | English | Please refer to: tli8hf/unqover-roberta-base-squad |
| cross-encoder/nli-roberta-base | English | Please refer to: cross-encoder/nli-roberta-base |
| nreimers/MiniLMv2-L6-H384-distilled-from-RoBERTa-Large | English | Please refer to: nreimers/MiniLMv2-L6-H384-distilled-from-RoBERTa-Large |
| seyonec/BPE_SELFIES_PubChem_shard00_160k | English | Please refer to: seyonec/BPE_SELFIES_PubChem_shard00_160k |
| CLTL/MedRoBERTa.nl | Dutch | Please refer to: CLTL/MedRoBERTa.nl |
| HooshvareLab/roberta-fa-zwnj-base | Persian | Please refer to: HooshvareLab/roberta-fa-zwnj-base |
| nyu-mll/roberta-base-100M-1 | English | Please refer to: nyu-mll/roberta-base-100M-1 |
| deepset/tinyroberta-squad2 | English | Please refer to: deepset/tinyroberta-squad2 |
| youscan/ukr-roberta-base | Ukrainian | Please refer to: youscan/ukr-roberta-base |
| navteca/roberta-base-squad2 | English | Please refer to: navteca/roberta-base-squad2 |
| bertin-project/bertin-roberta-base-spanish | Spanish | Please refer to: bertin-project/bertin-roberta-base-spanish |
| shiyue/roberta-large-tac08 | English | Please refer to: shiyue/roberta-large-tac08 |
| softcatala/julibert | Catalan | Please refer to: softcatala/julibert |
| elozano/tweet_sentiment_eval | English | Please refer to: elozano/tweet_sentiment_eval |
| cahya/roberta-base-indonesian-1.5G | Indonesian | Please refer to: cahya/roberta-base-indonesian-1.5G |
| elozano/tweet_emotion_eval | English | Please refer to: elozano/tweet_emotion_eval |
| navteca/roberta-large-squad2 | English | Please refer to: navteca/roberta-large-squad2 |
| elozano/tweet_offensive_eval | English | Please refer to: elozano/tweet_offensive_eval |
| ynie/roberta-large_conv_contradiction_detector_v0 | English | Please refer to: ynie/roberta-large_conv_contradiction_detector_v0 |
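Any weight in the table can be loaded by passing its name to PaddleNLP's `from_pretrained` interface. A minimal sketch, assuming `paddlenlp` is installed (the weight name shown is just one entry from the table; any other name works the same way):

```python
# Minimal sketch: loading a RoBERTa weight from the table above by name.
# Assumes the `paddlenlp` package is installed; on first use, the weights
# and vocabulary are downloaded and cached locally.
weight_name = "roberta-wwm-ext"  # any name from the "Pretrained Weight" column

try:
    from paddlenlp.transformers import RobertaModel, RobertaTokenizer

    # Tokenizer and model are loaded from the same weight name.
    tokenizer = RobertaTokenizer.from_pretrained(weight_name)
    model = RobertaModel.from_pretrained(weight_name)
    available = True
except ImportError:
    # PaddleNLP is not installed in this environment; the calls above
    # follow its documented `from_pretrained` API.
    available = False
```

The same pattern applies to task-specific classes such as `RobertaForSequenceClassification` in `paddlenlp.transformers`.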