DEIM/configs/deim_dfine/deim_hgnetv2_n_coco.yml, lines 29 to 32 in 8f28fe6:

## Our LR-Scheduler
flat_epoch: 7800    # 4 + epoch // 2, e.g., 40 = 4 + 72 / 2
no_aug_epoch: 12
lr_gamma: 1.0
In this nano version config, flat_epoch seems to be set incorrectly to 7800; shouldn't it be 78 instead? Also, why is lr_gamma set to 1.0? That seems strange, since wouldn't it leave the learning rate unchanged? Should I change it to 0.5 so that it matches the lr_gamma used in the other versions' configs, as in
Line 39 in 8f28fe6:

lr_gamma: 0.5
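To illustrate my concern, here is a rough sketch of how I understand a flat-cosine schedule would behave if lr_gamma is the ratio between the final and base learning rate. This is my own approximation, not the actual DEIM scheduler code, and the function name, formula, and the illustrative base LR / epoch numbers are my assumptions:

```python
import math

def flat_cosine_lr(epoch, base_lr, total_epochs, flat_epoch, lr_gamma):
    """Sketch of a flat-cosine schedule (my assumption, not DEIM's code):
    hold base_lr flat until flat_epoch, then cosine-decay it down to
    base_lr * lr_gamma over the remaining epochs."""
    final_lr = base_lr * lr_gamma
    if epoch < flat_epoch:
        return base_lr
    # Cosine decay from base_lr to final_lr after the flat phase.
    decay_epochs = max(total_epochs - flat_epoch, 1)
    progress = min((epoch - flat_epoch) / decay_epochs, 1.0)
    return final_lr + 0.5 * (base_lr - final_lr) * (1 + math.cos(math.pi * progress))

# With lr_gamma = 1.0, final_lr == base_lr, so the LR never decays:
print([flat_cosine_lr(e, 4e-4, 160, 78, lr_gamma=1.0) for e in (0, 80, 120, 159)])

# With lr_gamma = 0.5 (as in the other configs), the LR decays to half of base_lr:
print([flat_cosine_lr(e, 4e-4, 160, 78, lr_gamma=0.5) for e in (0, 80, 120, 159)])
```

If the scheduler really works like this, lr_gamma: 1.0 would make the post-flat cosine decay a no-op, which is what I find confusing.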
I have been trying to train a DEIM model that works better on long-tail objects for my machine learning course, so if anyone could help answer my question here it would be a great help. Thanks!