Description
```
/usr/lib/python3/dist-packages/llmfoundry/models/hf/hf_causal_lm.py:119 in __init__

  116 │   ]
  117 │
  118 │   # Construct the Hugging Face config to use
❱ 119 │   config = AutoConfig.from_pretrained(
  120 │       pretrained_model_name_or_path,
  121 │       trust_remote_code=trust_remote_code,
  122 │       use_auth_token=use_auth_token,

ValueError: The checkpoint you are trying to load has model type dbrx but
Transformers does not recognize this architecture. This could be because of
an issue with the checkpoint, or because your version of Transformers is
out of date.
```
This is the error I hit when trying to run dbrx_lora_ft.yaml. Any suggestions would be appreciated!
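Since the error message itself points to Transformers possibly being out of date, here is a minimal sketch for checking the installed version (the exact minimum version needed for dbrx support is not stated in the error, so that part is an assumption):

```python
# Check which version of Transformers is installed, since the ValueError
# suggests dbrx support may be missing from an older release.
from importlib.metadata import version, PackageNotFoundError

try:
    print("transformers version:", version("transformers"))
except PackageNotFoundError:
    print("transformers is not installed in this environment")
```

If the version turns out to be old, upgrading (for example with `pip install -U transformers`) is the usual first thing to try, per the error message's own hint.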