
[Fix] fix attention clamp max params #1034

Merged
1 commit merged into open-mmlab:dev on Sep 26, 2022
Conversation

@yingfhu (Collaborator) commented Sep 14, 2022

Motivation

The original clamp operation is hard to read.

Modification

Use np.log instead for better compatibility across all PyTorch versions.
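
For reference, a minimal sketch of the kind of change this describes (the variable names and the 1 / 0.01 bound are assumptions for illustration; the actual edit is in mmcls/models/utils/attention.py and may differ):

import numpy as np
import torch

# Learnable logit scale, as used in scaled cosine attention.
logit_scale = torch.nn.Parameter(torch.log(10 * torch.ones(1)))

# Before: the clamp bound is built with torch.log on a freshly created tensor,
# which is harder to read, and older PyTorch versions do not accept a Tensor
# as the `max` argument of torch.clamp.
# scale = torch.clamp(logit_scale, max=torch.log(torch.tensor(1. / 0.01))).exp()

# After: compute the bound once with np.log so `max` is a plain Python float,
# which torch.clamp accepts in every PyTorch version.
scale = torch.clamp(logit_scale, max=np.log(1. / 0.01)).exp()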

Checklist

Before PR:

  • Pre-commit or other linting tools are used to fix the potential lint issues.
  • Bug fixes are fully covered by unit tests; the case that caused the bug should be added to the unit tests.
  • The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  • The documentation has been modified accordingly, like docstring or example tutorials.

After PR:

  • If the modification has potential influence on downstream or other related projects, this PR should be tested with those projects, like MMDet or MMSeg.
  • CLA has been signed and all committers have signed the CLA in this PR.

@yingfhu yingfhu requested a review from mzr1996 September 14, 2022 07:51

codecov bot commented Sep 14, 2022

Codecov Report

Base: 86.14% // Head: 86.14% // No change to project coverage 👍

Coverage data is based on head (2ba3e89) compared to base (0b4a67d).
Patch has no changes to coverable lines.

Additional details and impacted files
@@           Coverage Diff           @@
##              dev    #1034   +/-   ##
=======================================
  Coverage   86.14%   86.14%           
=======================================
  Files         140      140           
  Lines        9678     9678           
  Branches     1675     1675           
=======================================
  Hits         8337     8337           
  Misses       1090     1090           
  Partials      251      251           
Flag Coverage Δ
unittests 86.07% <ø> (ø)

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmcls/models/utils/attention.py 96.91% <ø> (ø)


☔ View full report at Codecov.

@mzr1996 mzr1996 merged commit 6ebb3f7 into open-mmlab:dev Sep 26, 2022