help="checkpoint directory. Use with a sharded checkpoint, not for the standard llama2 model. Note, checkpoint_dir takes precedence over checkpoint if both are set.",
236
236
)
237
237
238
+
parser.add_argument(
239
+
"--adapter_checkpoint",
240
+
required=False,
241
+
help="Path to the adapter.pt file from torchtune. Used if the model has trained LoRA adapters. Must provide adapter_config.json",
242
+
)
243
+
244
+
parser.add_argument(
245
+
"--adapter_config",
246
+
required=False,
247
+
help="Path to the adapter_config.json file. Used if the model has trained LoRA adapters. Must provide adapter_checkpoint.",