model_utils#

class FasterPretrainedModel(*args, **kwargs)[source]#
to_static(output_path)[source]#

Exports the generation model to a static graph for inference.

Parameters:
  • output_path (str) -- Path where the exported inference model is saved.

  • config (dict) -- Configuration for generation (see the sketch below):

    • bos_token_id (int): token id of the begin-of-sentence token

    • eos_token_id (int): token id of the end-of-sentence token

    • pad_token_id (int): token id of the pad token

    • use_top_p (bool): whether to use the top_p decoding strategy
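
A minimal usage sketch. It assumes model is an instance of a FasterPretrainedModel subclass that supports generation, that the token ids are placeholders for the actual vocabulary, and that the config dict is passed as a second argument as the parameter list above implies (the signature shown here lists only output_path, so verify the exact call against the class definition):

# Sketch only: export a generation model to a static inference graph.
# The token ids below are placeholders; passing config as the second
# argument is an assumption based on the documented parameters.
config = {
    'bos_token_id': 101,
    'eos_token_id': 102,
    'pad_token_id': 0,
    'use_top_p': True,
}
model.to_static('./inference/generation', config)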

classmethod from_pretrained(pretrained_model_name_or_path, *args, **kwargs)[source]#

Creates an instance of PretrainedModel. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.

Parameters:
  • pretrained_model_name_or_path (str) --

    Name of pretrained model or dir path to load from. The string can be:

    • Name of a built-in pretrained model

    • Name of a community-contributed pretrained model.

    • Local directory path which contains the model weights file ("model_state.pdparams") and the model config file ("model_config.json").

  • *args (tuple) -- Positional arguments for the model __init__. If provided, these are used as positional argument values for model initialization.

  • **kwargs (dict) -- Keyword arguments for the model __init__. If provided, these update the pre-defined keyword argument values for model initialization. If a keyword is among the __init__ argument names of the base model, the base model's argument values are updated; otherwise the derived model's argument values are updated (see the keyword-argument example below).

Returns:

An instance of PretrainedModel.

Return type:

PretrainedModel

Example

from paddlenlp.transformers import BertForSequenceClassification

# Name of built-in pretrained model
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')

# Name of community-contributed pretrained model
model = BertForSequenceClassification.from_pretrained('yingyibiao/bert-base-uncased-sst-2-finetuned')

# Load from local directory path
model = BertForSequenceClassification.from_pretrained('./my_bert/')
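
As noted for **kwargs above, keyword arguments can also be forwarded to the model __init__. A hedged sketch; num_classes is assumed to be an accepted __init__ argument of BertForSequenceClassification:

# Forward a keyword argument to the model __init__
# (num_classes=2 is an assumed, illustrative value).
model = BertForSequenceClassification.from_pretrained('bert-base-uncased', num_classes=2)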
save_pretrained(save_dir)[source]#

Saves the model configuration and related resources (model state) as files under save_dir. The model configuration is saved to a file named "model_config.json", and the model state to a file named "model_state.pdparams".

The save_dir can then be passed to from_pretrained as the pretrained_model_name_or_path argument to re-load the trained model.

Parameters:

save_dir (str) -- Directory to save files into.

Example

from paddlenlp.transformers import BertForSequenceClassification

model = BertForSequenceClassification.from_pretrained('bert-base-uncased')
model.save_pretrained('./trained_model/')
# reload from save_directory
model = BertForSequenceClassification.from_pretrained('./trained_model/')
save_resources(save_directory)[source]#

Saves tokenizer-related resources to the files named by resource_files_names under save_directory by copying them directly. Override this method if necessary.

Parameters:

save_directory (str) -- Directory to save files into.
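
A minimal usage sketch, assuming model was loaded as in the examples above; the target directory is an arbitrary example path:

# Copy tokenizer-related resource files into the target directory
# (the directory path here is illustrative only).
model.save_resources('./trained_model/')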