When I change the torch_dtype in the loading call from torch.bfloat16 to torch.float16, i.e.

model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True, device_map="sequential", torch_dtype=torch.float16)

inference no longer works: the activations contain NaN values. Is this a known issue?
Environment: 8× A100; transformers 4.44.0
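For reference, a minimal sketch of how I hit the NaNs (model_name, the prompt, and the tokenizer call are placeholders I filled in for illustration; the from_pretrained arguments are exactly the ones above):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "path/or/hub-id-of-the-model"  # placeholder for the actual checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    trust_remote_code=True,
    device_map="sequential",
    torch_dtype=torch.float16,  # loading with torch.bfloat16 instead works for me
)

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model(**inputs)

# Check whether the output logits contain NaN; with float16 they do,
# with bfloat16 they are finite.
print(torch.isnan(out.logits).any())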