modeling#
- class AutoBackbone(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoBackbone.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoBackbone. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoBackbone.
- Return type:
Example
from paddlenlp.transformers import AutoBackbone

# Name of built-in pretrained model
model = AutoBackbone.from_pretrained("google/bit-50")
print(type(model))
# <class 'paddlenlp.transformers.bit.modeling.BitBackbone'>

# Load from local directory path
model = AutoBackbone.from_pretrained("./bit-50")
print(type(model))
# <class 'paddlenlp.transformers.bit.modeling.BitBackbone'>
- class AutoModel(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoClass can help you automatically retrieve the relevant model given the provided pretrained weights/vocabulary. AutoModel is a generic model class that will be instantiated as one of the base model classes when created with the from_pretrained() classmethod.
- classmethod from_pretrained(pretrained_model_name_or_path, task=None, *model_args, **kwargs)[source]#
Creates an instance of AutoModel. Model weights are loaded by specifying the name of a built-in pretrained model, a pretrained model on Hugging Face, a community-contributed model, or a local directory path.
- Parameters:
pretrained_model_name_or_path (str) – Name of the pretrained model or directory path to load from. The string can be:
- Name of a built-in pretrained model.
- Name of a community-contributed pretrained model.
- Local directory path containing the model weights file ("model_state.pdparams") and the model config file ("model_config.json").
task (str) – Specify a downstream task. Task can be 'Model', 'ForPretraining', 'ForSequenceClassification', 'ForTokenClassification', 'ForQuestionAnswering', 'ForMultipleChoice', 'ForMaskedLM', 'ForCausalLM', 'Encoder', 'Decoder', 'Generator', 'Discriminator', 'ForConditionalGeneration'. Specifying a downstream task is only supported in AutoModel. Defaults to None.
*model_args (tuple) – Positional arguments for the model __init__. If provided, these are used as positional argument values for model initialization.
**kwargs (dict) – Keyword arguments for the model __init__. If provided, these update the pre-defined keyword argument values for model initialization. If a keyword is among the __init__ argument names of the base model, the base model's argument value is updated; otherwise the derived model's argument value is updated.
- Returns:
An instance of AutoModel.
- Return type:
Example
from paddlenlp.transformers import AutoModel

# Name of built-in pretrained model
model = AutoModel.from_pretrained('bert-base-uncased')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertModel'>

# Name of community-contributed pretrained model
model = AutoModel.from_pretrained('yingyibiao/bert-base-uncased-sst-2-finetuned')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertModel'>

# Load from local directory path
model = AutoModel.from_pretrained('./my_bert/')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertModel'>

# Choose a downstream task
model = AutoModel.from_pretrained('bert-base-uncased', task='ForPretraining')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForPretraining'>
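The keyword-override behavior described in the parameters above can be exercised directly through from_pretrained. A minimal sketch, assuming hidden_dropout_prob is accepted by the underlying BertModel configuration (an illustrative assumption, not verified against every PaddleNLP release):

from paddlenlp.transformers import AutoModel

# Hypothetical keyword override: hidden_dropout_prob is assumed to be a base-model
# __init__/config argument, so from_pretrained updates its pre-defined value before
# building the model rather than forwarding it to a derived head class.
model = AutoModel.from_pretrained('bert-base-uncased', hidden_dropout_prob=0.2)
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertModel'>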
- class AutoModelForPretraining(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoModelForPretraining.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoModelForPretraining. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoModelForPretraining.
- Return type:
Example
from paddlenlp.transformers import AutoModelForPretraining

# Name of built-in pretrained model
model = AutoModelForPretraining.from_pretrained('bert-base-uncased')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForPretraining'>

# Name of community-contributed pretrained model
model = AutoModelForPretraining.from_pretrained('iverxin/bert-base-japanese')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForPretraining'>

# Load from local directory path
model = AutoModelForPretraining.from_pretrained('./my_bert/')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForPretraining'>
- class AutoModelForSequenceClassification(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoModelForSequenceClassification.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoModelForSequenceClassification. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoModelForSequenceClassification.
- Return type:
Example
from paddlenlp.transformers import AutoModelForSequenceClassification

# Name of built-in pretrained model
model = AutoModelForSequenceClassification.from_pretrained('bert-base-uncased')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForSequenceClassification'>

# Name of community-contributed pretrained model
model = AutoModelForSequenceClassification.from_pretrained('iverxin/bert-base-japanese')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForSequenceClassification'>

# Load from local directory path
model = AutoModelForSequenceClassification.from_pretrained('./my_bert/')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForSequenceClassification'>
- class AutoModelForTokenClassification(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoModelForTokenClassification.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoModelForTokenClassification. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoModelForTokenClassification.
- Return type:
Example
from paddlenlp.transformers import AutoModelForTokenClassification

# Name of built-in pretrained model
model = AutoModelForTokenClassification.from_pretrained('bert-base-uncased')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForTokenClassification'>

# Name of community-contributed pretrained model
model = AutoModelForTokenClassification.from_pretrained('iverxin/bert-base-japanese')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForTokenClassification'>

# Load from local directory path
model = AutoModelForTokenClassification.from_pretrained('./my_bert/')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForTokenClassification'>
- class AutoModelForQuestionAnswering(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoModelForQuestionAnswering.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoModelForQuestionAnswering. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoModelForQuestionAnswering.
- Return type:
Example
from paddlenlp.transformers import AutoModelForQuestionAnswering

# Name of built-in pretrained model
model = AutoModelForQuestionAnswering.from_pretrained('bert-base-uncased')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForQuestionAnswering'>

# Name of community-contributed pretrained model
model = AutoModelForQuestionAnswering.from_pretrained('iverxin/bert-base-japanese')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForQuestionAnswering'>

# Load from local directory path
model = AutoModelForQuestionAnswering.from_pretrained('./my_bert/')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForQuestionAnswering'>
- class AutoModelForMultipleChoice(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoModelForMultipleChoice.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoModelForMultipleChoice. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoModelForMultipleChoice.
- Return type:
Example
from paddlenlp.transformers import AutoModelForMultipleChoice

# Name of built-in pretrained model
model = AutoModelForMultipleChoice.from_pretrained('bert-base-uncased')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForMultipleChoice'>

# Name of community-contributed pretrained model
model = AutoModelForMultipleChoice.from_pretrained('iverxin/bert-base-japanese')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForMultipleChoice'>

# Load from local directory path
model = AutoModelForMultipleChoice.from_pretrained('./my_bert/')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForMultipleChoice'>
- class AutoModelForMaskedLM(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoModelForMaskedLM.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoModelForMaskedLM. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoModelForMaskedLM.
- Return type:
Example
from paddlenlp.transformers import AutoModelForMaskedLM

# Name of built-in pretrained model
model = AutoModelForMaskedLM.from_pretrained('bert-base-uncased')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForMaskedLM'>

# Name of community-contributed pretrained model
model = AutoModelForMaskedLM.from_pretrained('iverxin/bert-base-japanese')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForMaskedLM'>

# Load from local directory path
model = AutoModelForMaskedLM.from_pretrained('./my_bert/')
print(type(model))
# <class 'paddlenlp.transformers.bert.modeling.BertForMaskedLM'>
- class AutoModelForCausalLM(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoModelForCausalLM.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoModelForCausalLM. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoModelForCausalLM.
- Return type:
Example
from paddlenlp.transformers import AutoModelForCausalLM

# Name of built-in pretrained model
model = AutoModelForCausalLM.from_pretrained('gpt2-en')
print(type(model))
# <class 'paddlenlp.transformers.gpt.modeling.GPTLMHeadModel'>

# Name of community-contributed pretrained model
model = AutoModelForCausalLM.from_pretrained('junnyu/distilgpt2')
print(type(model))
# <class 'paddlenlp.transformers.gpt.modeling.GPTLMHeadModel'>

# Load from local directory path
model = AutoModelForCausalLM.from_pretrained('./my_gpt/')
print(type(model))
# <class 'paddlenlp.transformers.gpt.modeling.GPTLMHeadModel'>
- class AutoModelForCausalLMPipe(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
Pipeline model for AutoModelForCausalLM.
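AutoModelForCausalLMPipe shares the from_pretrained interface of the other auto classes but resolves to the pipeline-parallel implementation of the matched causal LM. A minimal sketch; the model name and the returned class are illustrative assumptions, and a pipeline-parallel distributed environment is typically expected to be initialized beforehand:

from paddlenlp.transformers import AutoModelForCausalLMPipe

# Hypothetical usage: the checkpoint name is resolved to the pipeline-parallel variant
# of the matched causal LM. 'facebook/llama-7b' is used only as an illustrative name;
# running this generally requires a pipeline-parallel setup (e.g. paddle distributed fleet).
model = AutoModelForCausalLMPipe.from_pretrained('facebook/llama-7b')
print(type(model))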
- class AutoEncoder(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoEncoder.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoEncoder. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoEncoder.
- Return type:
Example
from paddlenlp.transformers import AutoEncoder

# Name of built-in pretrained model
model = AutoEncoder.from_pretrained('bart-base', vocab_size=20000)
print(type(model))
# <class 'paddlenlp.transformers.bart.modeling.BartEncoder'>

# Load from local directory path
model = AutoEncoder.from_pretrained('./my_bart/')
print(type(model))
# <class 'paddlenlp.transformers.bart.modeling.BartEncoder'>
- class AutoDecoder(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoDecoder.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoDecoder. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoDecoder.
- Return type:
Example
from paddlenlp.transformers import AutoDecoder

# Name of built-in pretrained model
model = AutoDecoder.from_pretrained('bart-base', vocab_size=20000)
print(type(model))
# <class 'paddlenlp.transformers.bart.modeling.BartDecoder'>

# Load from local directory path
model = AutoDecoder.from_pretrained('./my_bart/')
print(type(model))
# <class 'paddlenlp.transformers.bart.modeling.BartDecoder'>
- class AutoGenerator(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoGenerator.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoGenerator. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoGenerator.
- Return type:
Example
from paddlenlp.transformers import AutoGenerator

# Name of built-in pretrained model
model = AutoGenerator.from_pretrained('electra-small')
print(type(model))
# <class 'paddlenlp.transformers.electra.modeling.ElectraGenerator'>

# Name of community-contributed pretrained model
model = AutoGenerator.from_pretrained('junnyu/hfl-chinese-legal-electra-small-generator')
print(type(model))
# <class 'paddlenlp.transformers.electra.modeling.ElectraGenerator'>

# Load from local directory path
model = AutoGenerator.from_pretrained('./my_electra/')
print(type(model))
# <class 'paddlenlp.transformers.electra.modeling.ElectraGenerator'>
- class AutoDiscriminator(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoDiscriminator.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoDiscriminator. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoDiscriminator.
- Return type:
Example
from paddlenlp.transformers import AutoDiscriminator

# Name of built-in pretrained model
model = AutoDiscriminator.from_pretrained('electra-small')
print(type(model))
# <class 'paddlenlp.transformers.electra.modeling.ElectraDiscriminator'>

# Name of community-contributed pretrained model
model = AutoDiscriminator.from_pretrained('junnyu/hfl-chinese-legal-electra-small-generator')
print(type(model))
# <class 'paddlenlp.transformers.electra.modeling.ElectraDiscriminator'>

# Load from local directory path
model = AutoDiscriminator.from_pretrained('./my_electra/')
print(type(model))
# <class 'paddlenlp.transformers.electra.modeling.ElectraDiscriminator'>
- class AutoModelForConditionalGeneration(*args, **kwargs)[source]#
Bases:
_BaseAutoModelClass
AutoModelForConditionalGeneration.
- classmethod from_pretrained(pretrained_model_name_or_path, *model_args, **kwargs)[source]#
Creates an instance of AutoModelForConditionalGeneration. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.
- Parameters:
- Returns:
An instance of AutoModelForConditionalGeneration.
- Return type:
Example
from paddlenlp.transformers import AutoModelForConditionalGeneration

# Name of built-in pretrained model
model = AutoModelForConditionalGeneration.from_pretrained('bart-base')
print(type(model))
# <class 'paddlenlp.transformers.bart.modeling.BartForConditionalGeneration'>

# Load from local directory path
model = AutoModelForConditionalGeneration.from_pretrained('./my_bart/')
print(type(model))
# <class 'paddlenlp.transformers.bart.modeling.BartForConditionalGeneration'>