BERT Model Summary
The table below summarizes the pretrained weights of the BERT models currently supported by PaddleNLP. For details of each model, refer to the corresponding link.
| Pretrained Weight | Language | Details of the model |
|---|---|---|
| bert-base-uncased | English | 12-layer, 768-hidden, 12-heads, 110M parameters. Trained on lower-cased English text. |
| bert-large-uncased | English | 24-layer, 1024-hidden, 16-heads, 336M parameters. Trained on lower-cased English text. |
| bert-base-cased | English | 12-layer, 768-hidden, 12-heads, 109M parameters. Trained on cased English text. |
| bert-large-cased | English | 24-layer, 1024-hidden, 16-heads, 335M parameters. Trained on cased English text. |
| bert-base-multilingual-uncased | Multilingual | 12-layer, 768-hidden, 12-heads, 168M parameters. Trained on lower-cased text in the top 102 languages with the largest Wikipedias. |
| bert-base-multilingual-cased | Multilingual | 12-layer, 768-hidden, 12-heads, 179M parameters. Trained on cased text in the top 104 languages with the largest Wikipedias. |
| bert-base-chinese | Chinese | 12-layer, 768-hidden, 12-heads, 108M parameters. Trained on cased Chinese Simplified and Traditional text. |
| bert-wwm-chinese | Chinese | 12-layer, 768-hidden, 12-heads, 108M parameters. Trained on cased Chinese Simplified and Traditional text using Whole-Word-Masking. |
| bert-wwm-ext-chinese | Chinese | 12-layer, 768-hidden, 12-heads, 108M parameters. Trained on cased Chinese Simplified and Traditional text using Whole-Word-Masking with extended data. |
| uer/chinese_roberta_L-12_H-768 | Chinese | Please refer to: uer/chinese_roberta_L-12_H-768 |
| uer/chinese_roberta_L-8_H-512 | Chinese | Please refer to: uer/chinese_roberta_L-8_H-512 |
| uer/chinese_roberta_L-4_H-512 | Chinese | Please refer to: uer/chinese_roberta_L-4_H-512 |
| uer/chinese_roberta_L-4_H-256 | Chinese | Please refer to: uer/chinese_roberta_L-4_H-256 |
| uer/chinese_roberta_L-2_H-128 | Chinese | Please refer to: uer/chinese_roberta_L-2_H-128 |
| uer/chinese_roberta_L-6_H-768 | Chinese | Please refer to: uer/chinese_roberta_L-6_H-768 |
| ckiplab/bert-base-chinese-pos | Chinese | Please refer to: ckiplab/bert-base-chinese-pos |
| tbs17/MathBERT | English | Please refer to: tbs17/MathBERT |
| macbert-base-chinese | Chinese | 12-layer, 768-hidden, 12-heads, 102M parameters. Trained with novel MLM as correction pre-training task. |
| macbert-large-chinese | Chinese | 24-layer, 1024-hidden, 16-heads, 326M parameters. Trained with novel MLM as correction pre-training task. |
| simbert-base-chinese | Chinese | 12-layer, 768-hidden, 12-heads, 108M parameters. Trained on 22 million pairs of similar sentences crawled from Baidu Know. |
| Langboat/mengzi-bert-base | Chinese | 12-layer, 768-hidden, 12-heads, 102M parameters. Trained on 300G Chinese Corpus Datasets. |
| Langboat/mengzi-bert-base-fin | Chinese | 12-layer, 768-hidden, 12-heads, 102M parameters. Trained on 20G Financial Corpus, based on Langboat/mengzi-bert-base. |
| cross-encoder/ms-marco-MiniLM-L-12-v2 | English | Please refer to: cross-encoder/ms-marco-MiniLM-L-12-v2 |
| cl-tohoku/bert-base-japanese-char | Japanese | Please refer to: cl-tohoku/bert-base-japanese-char |
| cl-tohoku/bert-base-japanese-whole-word-masking | Japanese | Please refer to: cl-tohoku/bert-base-japanese-whole-word-masking |
| cl-tohoku/bert-base-japanese | Japanese | Please refer to: cl-tohoku/bert-base-japanese |
| nlptown/bert-base-multilingual-uncased-sentiment | Multilingual | Please refer to: nlptown/bert-base-multilingual-uncased-sentiment |
| bert-large-uncased-whole-word-masking-finetuned-squad | English | Please refer to: bert-large-uncased-whole-word-masking-finetuned-squad |
| finiteautomata/beto-sentiment-analysis | Spanish | Please refer to: finiteautomata/beto-sentiment-analysis |
| hfl/chinese-bert-wwm-ext | Chinese | Please refer to: hfl/chinese-bert-wwm-ext |
| emilyalsentzer/Bio_ClinicalBERT | English | Please refer to: emilyalsentzer/Bio_ClinicalBERT |
| dslim/bert-base-NER | English | Please refer to: dslim/bert-base-NER |
| deepset/bert-large-uncased-whole-word-masking-squad2 | English | Please refer to: deepset/bert-large-uncased-whole-word-masking-squad2 |
| neuralmind/bert-base-portuguese-cased | Portuguese | Please refer to: neuralmind/bert-base-portuguese-cased |
| SpanBERT/spanbert-large-cased | English | Please refer to: SpanBERT/spanbert-large-cased |
| dslim/bert-large-NER | English | Please refer to: dslim/bert-large-NER |
| bert-base-german-cased | German | Please refer to: bert-base-german-cased |
| deepset/sentence_bert | English | Please refer to: deepset/sentence_bert |
| ProsusAI/finbert | English | Please refer to: ProsusAI/finbert |
| oliverguhr/german-sentiment-bert | German | Please refer to: oliverguhr/german-sentiment-bert |
| google/bert_uncased_L-2_H-128_A-2 | English | Please refer to: google/bert_uncased_L-2_H-128_A-2 |
| microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract | English | Please refer to: microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract |
| DeepPavlov/rubert-base-cased | Russian | Please refer to: DeepPavlov/rubert-base-cased |
| wietsedv/bert-base-dutch-cased | Dutch | Please refer to: wietsedv/bert-base-dutch-cased |
| monologg/bert-base-cased-goemotions-original | English | Please refer to: monologg/bert-base-cased-goemotions-original |
| allenai/scibert_scivocab_uncased | English | Please refer to: allenai/scibert_scivocab_uncased |
| dbmdz/bert-large-cased-finetuned-conll03-english | English | Please refer to: dbmdz/bert-large-cased-finetuned-conll03-english |
| microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext | English | Please refer to: microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext |
| bert-large-uncased-whole-word-masking | English | Please refer to: bert-large-uncased-whole-word-masking |
| dccuchile/bert-base-spanish-wwm-uncased | Spanish | Please refer to: dccuchile/bert-base-spanish-wwm-uncased |
| google/bert_uncased_L-6_H-256_A-4 | English | Please refer to: google/bert_uncased_L-6_H-256_A-4 |
| google/bert_uncased_L-4_H-512_A-8 | English | Please refer to: google/bert_uncased_L-4_H-512_A-8 |
| FPTAI/vibert-base-cased | Vietnamese | Please refer to: FPTAI/vibert-base-cased |
| cointegrated/rubert-tiny | Russian | Please refer to: cointegrated/rubert-tiny |
| bert-base-german-dbmdz-uncased | German | Please refer to: bert-base-german-dbmdz-uncased |
| dbmdz/bert-base-turkish-128k-cased | Turkish | Please refer to: dbmdz/bert-base-turkish-128k-cased |
| dbmdz/bert-base-german-uncased | German | Please refer to: dbmdz/bert-base-german-uncased |
| deepset/minilm-uncased-squad2 | English | Please refer to: deepset/minilm-uncased-squad2 |
| HooshvareLab/bert-base-parsbert-uncased | Persian | Please refer to: HooshvareLab/bert-base-parsbert-uncased |
| textattack/bert-base-uncased-ag-news | English | Please refer to: textattack/bert-base-uncased-ag-news |
| cl-tohoku/bert-base-japanese-v2 | Japanese | Please refer to: cl-tohoku/bert-base-japanese-v2 |
| emilyalsentzer/Bio_Discharge_Summary_BERT | English | Please refer to: emilyalsentzer/Bio_Discharge_Summary_BERT |
| KoichiYasuoka/bert-base-japanese-upos | Japanese | Please refer to: KoichiYasuoka/bert-base-japanese-upos |
| dbmdz/bert-base-italian-xxl-cased | Italian | Please refer to: dbmdz/bert-base-italian-xxl-cased |
| deepset/bert-base-cased-squad2 | English | Please refer to: deepset/bert-base-cased-squad2 |
| beomi/kcbert-large | Korean | Please refer to: beomi/kcbert-large |
| bert-large-cased-whole-word-masking-finetuned-squad | English | Please refer to: bert-large-cased-whole-word-masking-finetuned-squad |
| neuralmind/bert-large-portuguese-cased | Portuguese | Please refer to: neuralmind/bert-large-portuguese-cased |
| Luyu/co-condenser-marco | English | Please refer to: Luyu/co-condenser-marco |
| Sahajtomar/German_Zeroshot | German | Please refer to: Sahajtomar/German_Zeroshot |
| indolem/indobert-base-uncased | Indonesian | Please refer to: indolem/indobert-base-uncased |
| shibing624/text2vec-base-chinese | Chinese | Please refer to: shibing624/text2vec-base-chinese |
| cointegrated/LaBSE-en-ru | English and Russian | Please refer to: cointegrated/LaBSE-en-ru |
| prithivida/parrot_fluency_on_BERT | English | Please refer to: prithivida/parrot_fluency_on_BERT |
| textattack/bert-base-uncased-SST-2 | English | Please refer to: textattack/bert-base-uncased-SST-2 |
| textattack/bert-base-uncased-snli | English | Please refer to: textattack/bert-base-uncased-snli |
| klue/bert-base | Korean | Please refer to: klue/bert-base |
| asafaya/bert-base-arabic | Arabic | Please refer to: asafaya/bert-base-arabic |
| textattack/bert-base-uncased-MRPC | English | Please refer to: textattack/bert-base-uncased-MRPC |
| textattack/bert-base-uncased-imdb | English | Please refer to: textattack/bert-base-uncased-imdb |
| cross-encoder/ms-marco-TinyBERT-L-2 | English | Please refer to: cross-encoder/ms-marco-TinyBERT-L-2 |
| mrm8488/bert-tiny-finetuned-sms-spam-detection | English | Please refer to: mrm8488/bert-tiny-finetuned-sms-spam-detection |
| felflare/bert-restore-punctuation | English | Please refer to: felflare/bert-restore-punctuation |
| sshleifer/tiny-dbmdz-bert-large-cased-finetuned-conll03-english | English | Please refer to: sshleifer/tiny-dbmdz-bert-large-cased-finetuned-conll03-english |
| textattack/bert-base-uncased-rotten-tomatoes | English | Please refer to: textattack/bert-base-uncased-rotten-tomatoes |
| nlpaueb/legal-bert-base-uncased | English | Please refer to: nlpaueb/legal-bert-base-uncased |
| hf-internal-testing/tiny-bert-for-token-classification | English | Please refer to: hf-internal-testing/tiny-bert-for-token-classification |
| cointegrated/rubert-tiny2 | Russian | Please refer to: cointegrated/rubert-tiny2 |
| kykim/bert-kor-base | Korean | Please refer to: kykim/bert-kor-base |
| cl-tohoku/bert-base-japanese-char-v2 | Japanese | Please refer to: cl-tohoku/bert-base-japanese-char-v2 |
| mrm8488/bert-small-finetuned-squadv2 | English | Please refer to: mrm8488/bert-small-finetuned-squadv2 |
| beomi/kcbert-base | Korean | Please refer to: beomi/kcbert-base |
| textattack/bert-base-uncased-MNLI | English | Please refer to: textattack/bert-base-uncased-MNLI |
| textattack/bert-base-uncased-WNLI | English | Please refer to: textattack/bert-base-uncased-WNLI |
| dbmdz/bert-base-turkish-cased | Turkish | Please refer to: dbmdz/bert-base-turkish-cased |
| huawei-noah/TinyBERT_General_4L_312D | English | Please refer to: huawei-noah/TinyBERT_General_4L_312D |
| textattack/bert-base-uncased-QQP | English | Please refer to: textattack/bert-base-uncased-QQP |
| textattack/bert-base-uncased-STS-B | English | Please refer to: textattack/bert-base-uncased-STS-B |
| allenai/scibert_scivocab_cased | English | Please refer to: allenai/scibert_scivocab_cased |
| mrm8488/bert-medium-finetuned-squadv2 | English | Please refer to: mrm8488/bert-medium-finetuned-squadv2 |
| TurkuNLP/bert-base-finnish-cased-v1 | Finnish | Please refer to: TurkuNLP/bert-base-finnish-cased-v1 |
| textattack/bert-base-uncased-RTE | English | Please refer to: textattack/bert-base-uncased-RTE |
| uer/roberta-base-chinese-extractive-qa | Chinese | Please refer to: uer/roberta-base-chinese-extractive-qa |
| textattack/bert-base-uncased-QNLI | English | Please refer to: textattack/bert-base-uncased-QNLI |
| textattack/bert-base-uncased-CoLA | English | Please refer to: textattack/bert-base-uncased-CoLA |
| dmis-lab/biobert-base-cased-v1.2 | English | Please refer to: dmis-lab/biobert-base-cased-v1.2 |
| pierreguillou/bert-base-cased-squad-v1.1-portuguese | Portuguese | Please refer to: pierreguillou/bert-base-cased-squad-v1.1-portuguese |
| KB/bert-base-swedish-cased | Swedish | Please refer to: KB/bert-base-swedish-cased |
| uer/roberta-base-finetuned-cluener2020-chinese | Chinese | Please refer to: uer/roberta-base-finetuned-cluener2020-chinese |
| onlplab/alephbert-base | Hebrew | Please refer to: onlplab/alephbert-base |
| mrm8488/bert-spanish-cased-finetuned-ner | Spanish | Please refer to: mrm8488/bert-spanish-cased-finetuned-ner |
| alvaroalon2/biobert_chemical_ner | English | Please refer to: alvaroalon2/biobert_chemical_ner |
| bert-base-cased-finetuned-mrpc | English | Please refer to: bert-base-cased-finetuned-mrpc |
| unitary/toxic-bert | English | Please refer to: unitary/toxic-bert |
| nlpaueb/bert-base-greek-uncased-v1 | Greek | Please refer to: nlpaueb/bert-base-greek-uncased-v1 |
| HooshvareLab/bert-fa-base-uncased-sentiment-snappfood | Persian | Please refer to: HooshvareLab/bert-fa-base-uncased-sentiment-snappfood |
| Maltehb/danish-bert-botxo | Danish | Please refer to: Maltehb/danish-bert-botxo |
| shahrukhx01/bert-mini-finetune-question-detection | English | Please refer to: shahrukhx01/bert-mini-finetune-question-detection |
| GroNLP/bert-base-dutch-cased | Dutch | Please refer to: GroNLP/bert-base-dutch-cased |
| SpanBERT/spanbert-base-cased | English | Please refer to: SpanBERT/spanbert-base-cased |
| dbmdz/bert-base-italian-uncased | Italian | Please refer to: dbmdz/bert-base-italian-uncased |
| dbmdz/bert-base-german-cased | German | Please refer to: dbmdz/bert-base-german-cased |
| cl-tohoku/bert-large-japanese | Japanese | Please refer to: cl-tohoku/bert-large-japanese |
| hfl/chinese-bert-wwm | Chinese | Please refer to: hfl/chinese-bert-wwm |
| hfl/chinese-macbert-large | Chinese | Please refer to: hfl/chinese-macbert-large |
| dslim/bert-base-NER-uncased | English | Please refer to: dslim/bert-base-NER-uncased |
| amberoad/bert-multilingual-passage-reranking-msmarco | Multilingual | Please refer to: amberoad/bert-multilingual-passage-reranking-msmarco |
| aubmindlab/bert-base-arabertv02 | Arabic | Please refer to: aubmindlab/bert-base-arabertv02 |
| google/bert_uncased_L-4_H-256_A-4 | English | Please refer to: google/bert_uncased_L-4_H-256_A-4 |
| DeepPavlov/rubert-base-cased-conversational | Russian | Please refer to: DeepPavlov/rubert-base-cased-conversational |
| dccuchile/bert-base-spanish-wwm-cased | Spanish | Please refer to: dccuchile/bert-base-spanish-wwm-cased |
| ckiplab/bert-base-chinese-ws | Chinese | Please refer to: ckiplab/bert-base-chinese-ws |
| daigo/bert-base-japanese-sentiment | Japanese | Please refer to: daigo/bert-base-japanese-sentiment |
| SZTAKI-HLT/hubert-base-cc | Hungarian | Please refer to: SZTAKI-HLT/hubert-base-cc |
| nlpaueb/legal-bert-small-uncased | English | Please refer to: nlpaueb/legal-bert-small-uncased |
| dumitrescustefan/bert-base-romanian-uncased-v1 | Romanian | Please refer to: dumitrescustefan/bert-base-romanian-uncased-v1 |
| google/muril-base-cased | Indian languages | Please refer to: google/muril-base-cased |
| dkleczek/bert-base-polish-uncased-v1 | Polish | Please refer to: dkleczek/bert-base-polish-uncased-v1 |
| ckiplab/bert-base-chinese-ner | Chinese | Please refer to: ckiplab/bert-base-chinese-ner |
| savasy/bert-base-turkish-sentiment-cased | Turkish | Please refer to: savasy/bert-base-turkish-sentiment-cased |
| mrm8488/distill-bert-base-spanish-wwm-cased-finetuned-spa-squad2-es | Spanish | Please refer to: mrm8488/distill-bert-base-spanish-wwm-cased-finetuned-spa-squad2-es |
| KB/bert-base-swedish-cased-ner | Swedish | Please refer to: KB/bert-base-swedish-cased-ner |
| hfl/rbt3 | Chinese | Please refer to: hfl/rbt3 |
| remotejob/gradientclassification_v0 | English | Please refer to: remotejob/gradientclassification_v0 |
| Recognai/bert-base-spanish-wwm-cased-xnli | Spanish | Please refer to: Recognai/bert-base-spanish-wwm-cased-xnli |
| HooshvareLab/bert-fa-zwnj-base | Persian | Please refer to: HooshvareLab/bert-fa-zwnj-base |
| monologg/bert-base-cased-goemotions-group | English | Please refer to: monologg/bert-base-cased-goemotions-group |
| blanchefort/rubert-base-cased-sentiment | Russian | Please refer to: blanchefort/rubert-base-cased-sentiment |
| shibing624/macbert4csc-base-chinese | Chinese | Please refer to: shibing624/macbert4csc-base-chinese |
| google/bert_uncased_L-8_H-512_A-8 | English | Please refer to: google/bert_uncased_L-8_H-512_A-8 |
| bert-large-cased-whole-word-masking | English | Please refer to: bert-large-cased-whole-word-masking |
| alvaroalon2/biobert_diseases_ner | English | Please refer to: alvaroalon2/biobert_diseases_ner |
| philschmid/BERT-Banking77 | English | Please refer to: philschmid/BERT-Banking77 |
| dbmdz/bert-base-turkish-uncased | Turkish | Please refer to: dbmdz/bert-base-turkish-uncased |
| vblagoje/bert-english-uncased-finetuned-pos | English | Please refer to: vblagoje/bert-english-uncased-finetuned-pos |
| dumitrescustefan/bert-base-romanian-cased-v1 | Romanian | Please refer to: dumitrescustefan/bert-base-romanian-cased-v1 |
| nreimers/BERT-Tiny_L-2_H-128_A-2 | English | Please refer to: nreimers/BERT-Tiny_L-2_H-128_A-2 |
| digitalepidemiologylab/covid-twitter-bert-v2 | English | Please refer to: digitalepidemiologylab/covid-twitter-bert-v2 |
| UBC-NLP/MARBERT | Arabic (DA and MSA) | Please refer to: UBC-NLP/MARBERT |
| pierreguillou/bert-large-cased-squad-v1.1-portuguese | Portuguese | Please refer to: pierreguillou/bert-large-cased-squad-v1.1-portuguese |
| alvaroalon2/biobert_genetic_ner | English | Please refer to: alvaroalon2/biobert_genetic_ner |
| bvanaken/clinical-assertion-negation-bert | English | Please refer to: bvanaken/clinical-assertion-negation-bert |
| cross-encoder/stsb-TinyBERT-L-4 | English | Please refer to: cross-encoder/stsb-TinyBERT-L-4 |
| sshleifer/tiny-distilbert-base-cased | English | Please refer to: sshleifer/tiny-distilbert-base-cased |
| ckiplab/bert-base-chinese | Chinese | Please refer to: ckiplab/bert-base-chinese |
| fabriceyhc/bert-base-uncased-amazon_polarity | English | Please refer to: fabriceyhc/bert-base-uncased-amazon_polarity |
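Many of the community weight names encode their architecture directly in the suffix: `L` is the number of layers, `H` the hidden size, and `A` the number of attention heads, so `google/bert_uncased_L-2_H-128_A-2` is a 2-layer, 128-hidden, 2-head model. A small helper illustrating the convention (the function is hypothetical, not part of PaddleNLP):

```python
import re

def parse_arch(weight_name: str) -> dict:
    """Extract L/H/A architecture hints from a weight-name suffix.

    Returns whichever of layers / hidden size / heads the name encodes,
    e.g. "uer/chinese_roberta_L-12_H-768" -> {"layers": 12, "hidden": 768}.
    """
    keys = {"L": "layers", "H": "hidden", "A": "heads"}
    hints = {}
    for tag, value in re.findall(r"([LHA])-(\d+)", weight_name):
        hints[keys[tag]] = int(value)
    return hints

print(parse_arch("google/bert_uncased_L-2_H-128_A-2"))
# -> {'layers': 2, 'hidden': 128, 'heads': 2}
```

Names without such a suffix (e.g. `bert-base-uncased`) carry no architecture hints; for those, consult the details column or the model card.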