Hello there, when using the Google Colab notebook I reached this step:
from trl import SFTTrainer
from transformers import TrainingArguments, DataCollatorForSeq2Seq
from unsloth import is_bfloat16_supported

trainer = SFTTrainer(
    model = model,
    tokenizer = tokenizer,
    train_dataset = dataset,
    dataset_text_field = 'text',
    max_seq_length = max_seq_length,
    data_collator = DataCollatorForSeq2Seq(tokenizer = tokenizer),
    dataset_num_proc = 2,
    packing = False, # Can make training 5x faster for short sequences.
    args = TrainingArguments(
        per_device_train_batch_size = 2,
        gradient_accumulation_steps = 4,
        warmup_steps = 5,
        # num_train_epochs = 1, # Set this for 1 full training run.
        max_steps = 60,
        learning_rate = 2e-4,
        fp16 = not is_bfloat16_supported(),
        bf16 = is_bfloat16_supported(),
        logging_steps = 1,
        optim = "adamw_8bit",
        weight_decay = 0.01,
        lr_scheduler_type = "linear",
        seed = 3407,
        output_dir = "outputs",
        report_to = "none", # Use this for WandB etc
    ),
)
However, I get the following error:
TypeError: SFTTrainer.__init__() got an unexpected keyword argument 'dataset_text_field'
Is there any way to fix this issue?
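For reference, here is a minimal sketch of one possible workaround, assuming the error comes from a newer trl release where dataset_text_field, max_seq_length, dataset_num_proc, and packing moved from SFTTrainer into SFTConfig. This is not a confirmed fix from the maintainers, and exact field names can differ between trl versions; model, tokenizer, dataset, and max_seq_length are the same objects defined in the earlier notebook cells.

# Sketch of a possible workaround: pass the dataset-related arguments via
# SFTConfig instead of SFTTrainer/TrainingArguments (assumes a newer trl API).
from trl import SFTTrainer, SFTConfig
from transformers import DataCollatorForSeq2Seq
from unsloth import is_bfloat16_supported

trainer = SFTTrainer(
    model = model,
    tokenizer = tokenizer,
    train_dataset = dataset,
    data_collator = DataCollatorForSeq2Seq(tokenizer = tokenizer),
    args = SFTConfig(
        dataset_text_field = 'text',      # moved here from SFTTrainer
        max_seq_length = max_seq_length,  # moved here from SFTTrainer
        dataset_num_proc = 2,             # moved here from SFTTrainer
        packing = False,                  # moved here from SFTTrainer
        per_device_train_batch_size = 2,
        gradient_accumulation_steps = 4,
        warmup_steps = 5,
        max_steps = 60,
        learning_rate = 2e-4,
        fp16 = not is_bfloat16_supported(),
        bf16 = is_bfloat16_supported(),
        logging_steps = 1,
        optim = "adamw_8bit",
        weight_decay = 0.01,
        lr_scheduler_type = "linear",
        seed = 3407,
        output_dir = "outputs",
        report_to = "none",
    ),
)

Alternatively, pinning trl to an older release that still accepts these keywords on SFTTrainer may also avoid the TypeError.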