
Improve DINO #10809

Merged: 2 commits into open-mmlab:dev-3.x on Aug 21, 2023
Conversation

hhaAndroid (Collaborator) commented on Aug 18, 2023

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and easier to get feedback on. If you do not understand some items, don't worry: just open the pull request and ask the maintainers for help.

Motivation

| Backbone | Model | Lr schd | Better-Hyper | box AP | Config | Download |
| :------: | :---------: | :-----: | :----------: | :----: | :----: | :----------: |
| R-50 | DINO-4scale | 12e | False | 49.0 | config | model \| log |
| R-50 | DINO-4scale | 12e | True | 50.1 | config | model \| log |

hhaAndroid merged commit b4f62a4 into open-mmlab:dev-3.x on Aug 21, 2023
yumion pushed a commit to yumion/mmdetection that referenced this pull request on Jan 31, 2024
SyedShaQutub commented:

I recently noticed a significant modification in the DINO-DETR code in the mmdetection repository. Specifically, the self-attention blocks of both the encoder and decoder transformers appear to apply only normalization, without adding the identity (residual) connection after the multi-head attention block. Is there a particular reason for this substantial change?
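For context, a standard transformer self-attention block wraps multi-head attention in a residual (identity) connection and then normalizes. Below is a minimal PyTorch sketch of that post-norm pattern; it is illustrative only, not the mmdetection implementation. One plausible explanation for the observation above is that mmcv-style attention wrappers commonly add the identity term inside the attention module itself, so the surrounding layer code shows only the norm even though the residual is still applied. The `SelfAttentionBlock` name and dimensions here are hypothetical.

```python
import torch
import torch.nn as nn


class SelfAttentionBlock(nn.Module):
    """Post-norm self-attention: y = LayerNorm(x + MHA(x))."""

    def __init__(self, embed_dims: int = 256, num_heads: int = 8) -> None:
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dims, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dims)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual (identity) connection around multi-head attention,
        # then normalization. If a wrapper adds the identity inside the
        # attention call itself, the outer layer only shows the norm,
        # which can make the residual look "missing" on a quick read.
        attn_out, _ = self.attn(x, x, x, need_weights=False)
        return self.norm(x + attn_out)


if __name__ == "__main__":
    x = torch.randn(2, 100, 256)  # (batch, num_queries, embed_dims)
    print(SelfAttentionBlock()(x).shape)  # torch.Size([2, 100, 256])
```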
