
loss and lr not recorded on wandb #494

Open
@kf1111

Description


I attempted to log the loss and learning rate of my LoRA training run, but only GPU system metrics show up in wandb. My config.toml contains the following settings:

log_with = "wandb"
log_tracker_name = "lora_0511"
wandb_api_key = "apikey"

pretrained_model_name_or_path = "....ckpt"
train_data_dir = "..."

shuffle_caption = true
caption_extension = ".txt"
keep_tokens = 20
resolution = "768"
vae_batch_size = 4
enable_bucket = true
output_dir = "..."
output_name = "..."
save_precision = "fp16"
save_every_n_epochs = 10

train_batch_size = 2
gradient_checkpointing = true
gradient_accumulation_steps = 64

max_token_length = 150
xformers = true
max_train_epochs = 50
persistent_data_loader_workers = true
seed = 42
mixed_precision = "bf16"
clip_skip = 2

multires_noise_iterations = 6
multires_noise_discount = 0.1

flip_aug = true
use_8bit_adam = true
lr_scheduler = "cosine_with_restarts"
lr_warmup_steps = 12
lr_scheduler_num_cycles = 10
unet_lr = 0.0004
text_encoder_lr = 0.0002
network_module = "networks.lora"
network_dim = 64
network_alpha = 32.0
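
If it helps to rule out the wandb side, a minimal standalone check could look like the sketch below. The project and run names are just placeholders, and the loss/lr values are dummy numbers (not from my run); it only confirms that scalar metrics with the same API key reach the wandb dashboard.

```python
import math
import wandb

# Placeholder project/run names; the logged values are dummy numbers,
# only meant to confirm that scalar metrics appear in the wandb dashboard.
run = wandb.init(project="lora_0511", name="scalar-logging-smoke-test")

for step in range(20):
    dummy_loss = 1.0 / (step + 1)
    dummy_lr = 4e-4 * 0.5 * (1 + math.cos(math.pi * step / 20))  # toy cosine decay
    wandb.log({"loss": dummy_loss, "lr": dummy_lr}, step=step)

run.finish()
```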

I read #428, so I know it's OK to leave logging_dir unset.
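
For reference, my understanding is that the script reports metrics through accelerate's tracker API when log_with = "wandb" is set, roughly like the sketch below. This is only my assumption of the call pattern, not the actual training code; the project name and metric names are illustrative.

```python
from accelerate import Accelerator

# Assumed call pattern only, not the actual training script.
accelerator = Accelerator(log_with="wandb")

# Assumption: log_tracker_name from config.toml ends up as the project name.
accelerator.init_trackers(project_name="lora_0511")

# Inside the training loop, scalars would be reported roughly like this:
for step in range(3):
    accelerator.log({"loss/current": 0.5 / (step + 1), "lr/unet": 4e-4}, step=step)

accelerator.end_training()
```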

Labels

bug (Something isn't working)
