T5 Model Summary
The table below summarizes the pretrained weights of the T5 models currently supported by PaddleNLP. For details of each model, please refer to the corresponding link.
| Pretrained Weight | Language | Details of the model |
|---|---|---|
| t5-small | English | 6-layer, 512-hidden, 8-heads, 93M parameters. T5 small model. |
| t5-base | English | 12-layer, 768-hidden, 12-heads, 272M parameters. T5 base model. |
| t5-large | English | 24-layer, 1024-hidden, 16-heads, 803M parameters. T5 large model. |
| t5-v1_1-base | English | Please refer to: t5-v1_1-base |
| t5-v1_1-large | English | Please refer to: t5-v1_1-large |
| Langboat/mengzi-t5-base | Chinese | Please refer to: Langboat/mengzi-t5-base |
| Langboat/mengzi-t5-base-mt | Chinese | Please refer to: Langboat/mengzi-t5-base-mt |
| deep-learning-analytics/wikihow-t5-small | English | Please refer to: deep-learning-analytics/wikihow-t5-small |
| sberbank-ai/ruT5-base | Russian | Please refer to: sberbank-ai/ruT5-base |
| Michau/t5-base-en-generate-headline | English | Please refer to: Michau/t5-base-en-generate-headline |
| google/t5-v1_1-small | English | Please refer to: google/t5-v1_1-small |
| prithivida/parrot_paraphraser_on_T5 | English | Please refer to: prithivida/parrot_paraphraser_on_T5 |
| prithivida/grammar_error_correcter_v1 | English | Please refer to: prithivida/grammar_error_correcter_v1 |
| valhalla/t5-small-qg-hl | English | Please refer to: valhalla/t5-small-qg-hl |
| valhalla/t5-small-qa-qg-hl | English | Please refer to: valhalla/t5-small-qa-qg-hl |
| ramsrigouthamg/t5-large-paraphraser-diverse-high-quality | English | Please refer to: ramsrigouthamg/t5-large-paraphraser-diverse-high-quality |
| mrm8488/t5-base-finetuned-common_gen | English | Please refer to: mrm8488/t5-base-finetuned-common_gen |
| valhalla/t5-small-e2e-qg | English | Please refer to: valhalla/t5-small-e2e-qg |
| sonoisa/t5-base-japanese | Japanese | Please refer to: sonoisa/t5-base-japanese |
| google/t5-base-lm-adapt | English | Please refer to: google/t5-base-lm-adapt |
| google/t5-small-lm-adapt | English | Please refer to: google/t5-small-lm-adapt |
| valhalla/t5-small-qg-prepend | English | Please refer to: valhalla/t5-small-qg-prepend |
| prithivida/informal_to_formal_styletransfer | English | Please refer to: prithivida/informal_to_formal_styletransfer |
| KETI-AIR/ke-t5-base | Korean, English | Please refer to: KETI-AIR/ke-t5-base |
| nielsr/nt5-small-rc1 | English | Please refer to: nielsr/nt5-small-rc1 |
| snrspeaks/t5-one-line-summary | English | Please refer to: snrspeaks/t5-one-line-summary |
| mrm8488/t5-small-finetuned-quora-for-paraphrasing | English | Please refer to: mrm8488/t5-small-finetuned-quora-for-paraphrasing |
| p-christ/12412fsasf | English | Please refer to: p-christ/12412fsasf |
| tscholak/3vnuv1vf | English | Please refer to: tscholak/3vnuv1vf |
| tennessejoyce/titlewave-t5-base | English | Please refer to: tennessejoyce/titlewave-t5-base |
| vennify/t5-base-grammar-correction | English | Please refer to: vennify/t5-base-grammar-correction |
| megagonlabs/t5-base-japanese-web | Japanese | Please refer to: megagonlabs/t5-base-japanese-web |
| sberbank-ai/ruT5-large | Russian | Please refer to: sberbank-ai/ruT5-large |
| tscholak/t5.1.1.lm100k.base | English | Please refer to: tscholak/t5.1.1.lm100k.base |
| deep-learning-analytics/GrammarCorrector | English | Please refer to: deep-learning-analytics/GrammarCorrector |
| ThomasNLG/t5-qa_squad2neg-en | English | Please refer to: ThomasNLG/t5-qa_squad2neg-en |
| flexudy/t5-small-wav2vec2-grammar-fixer | English | Please refer to: flexudy/t5-small-wav2vec2-grammar-fixer |
| KETI-AIR/ke-t5-small | Korean, English | Please refer to: KETI-AIR/ke-t5-small |
| razent/SciFive-large-Pubmed_PMC | English | Please refer to: razent/SciFive-large-Pubmed_PMC |
| google/t5-large-ssm-nq | English | Please refer to: google/t5-large-ssm-nq |
| ozcangundes/T5-base-for-BioQA | English | Please refer to: ozcangundes/T5-base-for-BioQA |
| Rostlab/prot_t5_base_mt_uniref50 | Protein sequences | Please refer to: Rostlab/prot_t5_base_mt_uniref50 |
| sonoisa/t5-base-japanese-question-generation | Japanese | Please refer to: sonoisa/t5-base-japanese-question-generation |
| Wikidepia/IndoT5-base | Indonesian | Please refer to: Wikidepia/IndoT5-base |
| razent/SciFive-base-Pubmed_PMC | English | Please refer to: razent/SciFive-base-Pubmed_PMC |
| google/t5-small-ssm-nq | English | Please refer to: google/t5-small-ssm-nq |
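The weight names in the first column can be passed directly to `from_pretrained`. The snippet below is a minimal sketch assuming the standard `T5Tokenizer` and `T5ForConditionalGeneration` classes in `paddlenlp.transformers` and the usual `(ids, scores)` return of `generate`; any weight name from the table can be substituted for `model_name`.

```python
import paddle
from paddlenlp.transformers import T5ForConditionalGeneration, T5Tokenizer

# Any weight name from the table above, e.g. "Langboat/mengzi-t5-base" for Chinese.
model_name = "t5-small"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# T5 is a text-to-text model: the task is expressed as a prefix on the input string.
text = "translate English to German: The house is wonderful."
input_ids = paddle.to_tensor([tokenizer(text)["input_ids"]])

# PaddleNLP's generate() is assumed here to return (generated_ids, scores).
output_ids, _ = model.generate(input_ids=input_ids, max_length=50)
print(tokenizer.decode(output_ids[0].tolist(), skip_special_tokens=True))
```

The same weight name should be used for both the tokenizer and the model so that the vocabulary matches the pretrained checkpoint.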