
D-FINE nano version learning rate config error #132

@qwuzer

Description

```yaml
## Our LR-Scheduler
flat_epoch: 7800    # 4 + epoch // 2, e.g., 40 = 4 + 72 / 2
no_aug_epoch: 12
lr_gamma: 1.0
```

In this nano version config, `flat_epoch` seems to be wrongly set to 7800; it should probably be 78 instead. Also, why is `lr_gamma` set to 1.0? That seems strange, since wouldn't it leave the learning rate unchanged? Should I modify it to 0.5 to match the other versions' configs, which use:

```yaml
lr_gamma: 0.5
```

I have been trying to train a DEIM model that works better on long-tail objects for my machine learning course. If anyone could help answer my question here, it would be a great help, thanks!
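For context on why `lr_gamma: 1.0` looks suspicious: in a flat-then-cosine schedule, a gamma of 1.0 would make the cosine phase decay toward `base_lr * 1.0`, i.e. no decay at all. Below is a minimal sketch of such a schedule; the function name and the exact role of `lr_gamma` (as the ratio of the final LR to the base LR) are my assumptions for illustration, not necessarily how this repo implements it.

```python
import math

def flat_cosine_lr(step, total_steps, base_lr, flat_steps, lr_gamma):
    """Hypothetical flat-then-cosine schedule (illustration only):
    hold base_lr until flat_steps, then cosine-decay toward
    base_lr * lr_gamma by total_steps."""
    if step < flat_steps:
        return base_lr
    # Progress through the cosine phase, in [0, 1].
    t = (step - flat_steps) / max(1, total_steps - flat_steps)
    min_lr = base_lr * lr_gamma
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * t))

# With lr_gamma = 1.0, min_lr == base_lr, so the "decay" is a no-op
# and the learning rate stays constant for the whole run.
# With lr_gamma = 0.5, the LR decays from base_lr down to base_lr / 2.
```

Under this reading, `lr_gamma: 1.0` (and a `flat_epoch` far beyond the total epochs) would together mean the LR never decays, which supports the suspicion that both values are typos.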
