
GPT-2 models are unpicklable #4038

Closed
@Lawiss

Description


🐛 Bug

Information

Model I am using (Bert, XLNet ...): GPT-2

Language I am using the model on (English, Chinese ...): English

The problem arises when using:

  • the official example scripts: (give details below)
  • my own modified scripts:

Hi,

I'm trying to train a GPT-2 Double Heads Model (based on your transfer-learning-conv-ai guide) using PyTorch Lightning. However, I run into a problem when training the model with the ddp distributed backend: the GPT2DoubleHeadsModel class seems to be unpicklable, and my training script fails with the following error:
TypeError: can't pickle torch._C.ScriptFunction objects

To reproduce

Run:

from transformers import GPT2DoubleHeadsModel
import pickle

model = GPT2DoubleHeadsModel.from_pretrained("gpt2-medium")

# Pickling the full module fails with the TypeError above
with open("test.bin", "wb") as f:
    pickle.dump(model, f)
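
As a side check (just a sketch, assuming the offending object lives on the module itself rather than in the weights), pickling only the state_dict should work, since it contains nothing but tensors:

import pickle
from transformers import GPT2DoubleHeadsModel

model = GPT2DoubleHeadsModel.from_pretrained("gpt2-medium")

# state_dict() is an OrderedDict of tensors only, so it should pickle
# even when the full module object does not
with open("weights.bin", "wb") as f:
    pickle.dump(model.state_dict(), f)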

The problem does not occur with bert-base-uncased, for example. I tried to find which part of the GPT-2 class contains torch._C.ScriptFunction objects, without success. Do you have an idea how to avoid this error?
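
A rough sketch of the kind of scan that might locate the offending object (assuming the ScriptFunction is stored as a plain attribute on one of the submodules, e.g. a scripted activation function):

import torch
from transformers import GPT2DoubleHeadsModel

model = GPT2DoubleHeadsModel.from_pretrained("gpt2-medium")

# Walk every submodule and report plain attributes that are
# ScriptFunction objects, since those are what pickle chokes on
for mod_name, module in model.named_modules():
    for attr_name, value in vars(module).items():
        if isinstance(value, torch._C.ScriptFunction):
            print(f"{mod_name or '<root>'}.{attr_name} -> {value}")

If the culprit turns out to be a scripted activation function, replacing that attribute with an equivalent plain Python function before pickling might be one way around it.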

Thanks in advance.

Environment info

  • transformers version: 2.8
  • Platform: Ubuntu 18.04
  • Python version: 3.7
  • PyTorch version (GPU?): 1.5, CUDA 10.2
  • Using GPU in script?: Yes
  • Using distributed or parallel set-up in script?: Yes, ddp
