model_utils#

class FasterPretrainedModel(*args, **kwargs)[source]#
to_static(output_path)[source]#

Exports the generation model to a static graph model for inference.

Parameters:
  • output_path (str) – Path of the saved inference model.

  • config (dict) – Configuration for generation:

    • bos_token_id (int): token id of the begin-of-sentence token

    • eos_token_id (int): token id of the end-of-sentence token

    • pad_token_id (int): token id of the pad token

    • use_top_p (bool): whether to use the top_p decoding strategy
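
Example

A minimal sketch of how to_static might be called, based on the parameters documented above. FasterModelForGeneration is a hypothetical concrete FasterPretrainedModel subclass used only for illustration, and the token ids shown are placeholder values, not values taken from this API reference.

# `FasterModelForGeneration` is a hypothetical generation-capable subclass
# of FasterPretrainedModel, used here only for illustration.
model = FasterModelForGeneration.from_pretrained('model-name')

# Export the generation model to a static inference model under the given path.
model.to_static(
    './inference/model',
    config={
        'bos_token_id': 0,    # token id of the begin-of-sentence token (placeholder)
        'eos_token_id': 2,    # token id of the end-of-sentence token (placeholder)
        'pad_token_id': 0,    # token id of the pad token (placeholder)
        'use_top_p': True,    # whether to use the top_p decoding strategy
    },
)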

classmethod from_pretrained(pretrained_model_name_or_path, *args, **kwargs)[source]#

Creates an instance of PretrainedModel. Model weights are loaded by specifying the name of a built-in pretrained model, the name of a community-contributed model, or a local directory path.

Parameters:
  • pretrained_model_name_or_path (str) –

    Name of a pretrained model or a directory path to load from. The string can be:

    • Name of a built-in pretrained model.

    • Name of a community-contributed pretrained model.

    • Local directory path which contains the model weights file (“model_state.pdparams”) and the model config file (“model_config.json”).

  • *args (tuple) – Positional arguments for the model's __init__ method. If provided, these are used as positional argument values for model initialization.

  • **kwargs (dict) – Keyword arguments for the model's __init__ method. If provided, these override the pre-defined keyword argument values for model initialization. If a keyword matches an __init__ argument name of the base model, the base model's argument value is updated; otherwise the derived model's argument value is updated.

Returns:

An instance of PretrainedModel.

Return type:

PretrainedModel

Example

from paddlenlp.transformers import BertForSequenceClassification

# Name of built-in pretrained model
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')

# Name of community-contributed pretrained model
model = BertForSequenceClassification.from_pretrained('yingyibiao/bert-base-uncased-sst-2-finetuned')

# Load from local directory path
model = BertForSequenceClassification.from_pretrained('./my_bert/')
save_pretrained(save_dir)[source]#

Saves the model configuration and related resources (model state) as files under save_dir. The model configuration is saved to a file named “model_config.json”, and the model state is saved to a file named “model_state.pdparams”.

The save_dir can then be passed to from_pretrained as the pretrained_model_name_or_path argument to reload the trained model.

Parameters:

save_dir (str) – Directory to save files into.

Example

from paddlenlp.transformers import BertForSequenceClassification

model = BertForSequenceClassification.from_pretrained('bert-base-uncased')
model.save_pretrained('./trained_model/')
# reload from save_directory
model = BertForSequenceClassification.from_pretrained('./trained_model/')
save_resources(save_directory)[source]#

Saves tokenizer-related resources, i.e. the files listed in resource_files_names, under save_directory by copying them directly. Override this method if necessary.

Parameters:

save_directory (str) – Directory to save files into.
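
Example

A brief usage sketch, assuming model is an instance of a FasterPretrainedModel subclass loaded via from_pretrained; the directory path is illustrative only.

# Save weights and config, then copy the tokenizer-related resource files
# into the same directory so the model can be fully reloaded from it.
model.save_pretrained('./trained_model/')
model.save_resources('./trained_model/')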