Implementation of intensity clipping transform: both hard and soft approaches #7535
Merged
Conversation
… soft clipping approaches Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
KumoLiu requested review from atbenmurray, ericspod, Nic-Ma, KumoLiu and dongyang0122 on March 18, 2024 03:17
KumoLiu approved these changes Mar 18, 2024
Thanks for the PR, overall looks good to me.
Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
… percentile-clipper
Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
for more information, see https://pre-commit.ci
Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
… percentile-clipper
dongyang0122 reviewed Apr 4, 2024
Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
dongyang0122 approved these changes Apr 4, 2024
dongyang0122 reviewed Apr 4, 2024
…ames for ClipIntensityPercentiles class Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
for more information, see https://pre-commit.ci
…lues Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
/build
2 similar comments
/build
/build
Fixes Issue #7512.
Description
Addition of a transform that clips values above or below given percentiles.
Clipping can be hard or soft.
With soft clipping, the function remains differentiable and the ordering of values is preserved, with smoother corners.
The soft clipping function is based on this Medium article: https://medium.com/life-at-hopper/clip-it-clip-it-good-1f1bf711b291
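As a rough illustration of the hard variant, here is a minimal, self-contained sketch. The helper names and the linear-interpolation percentile are my own assumptions for illustration, not the PR's implementation (which operates on MONAI tensors, and would normally rely on `numpy.percentile` or the torch equivalent):

```python
import math

def percentile(values, q):
    # Linear-interpolation percentile over a sorted copy
    # (hypothetical helper; numpy.percentile would normally be used).
    s = sorted(values)
    k = (len(s) - 1) * q / 100.0
    f, c = math.floor(k), math.ceil(k)
    if f == c:
        return float(s[f])
    return s[f] * (c - k) + s[c] * (k - f)

def hard_clip_percentiles(values, lower=5.0, upper=95.0):
    # Clip every value into [P_lower, P_upper]; None disables a bound.
    lo = percentile(values, lower) if lower is not None else None
    hi = percentile(values, upper) if upper is not None else None
    out = []
    for v in values:
        if lo is not None and v < lo:
            v = lo
        if hi is not None and v > hi:
            v = hi
        out.append(v)
    return out
```

Note that for hard clipping, passing `None` and passing a percentile of 0 or 100 are equivalent; as the description below explains, the distinction only matters for soft clipping.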
It's important to note that I chose to allow None values for the percentiles, to account for the fact that soft clipping can be one-sided or two-sided.
Providing percentiles of 100 or 0 changes nothing in the case of hard clipping, but it does in the case of soft clipping, because the function is smoothed. Hence the option of passing None, to avoid smoothing the function on one side or the other.
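A minimal sketch of one- and two-sided soft clipping along these lines, following the formula in the linked article. This is illustrative code, not the PR's actual `soft_clip`; the `sharpness` parameter name is my assumption:

```python
import math

def _softplus(x: float) -> float:
    # Numerically stable log(1 + exp(x)), i.e. logaddexp(x, 0).
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def soft_clip(x: float, low=None, high=None, sharpness: float = 1.0) -> float:
    """Differentiable approximation of clip(x, low, high).

    Passing None for a bound leaves that side untouched, which is the
    point made above: the curve is only smoothed on the bounded sides,
    unlike a percentile of 0/100, which would still smooth that side.
    """
    y = x
    if low is not None:
        y = low + _softplus(sharpness * (x - low)) / sharpness
    if high is not None:
        y -= _softplus(sharpness * (x - high)) / sharpness
    return y
```

Far below `low` the result tends to `low`, far above `high` it tends to `high`, and in between it stays close to the identity; larger `sharpness` gives tighter corners, and the function is strictly increasing, so value ordering is preserved.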
To implement this, we had to define a `softplus` function in `monai.transforms.utils_pytorch_numpy_unification.py`. One problem is that `np.logaddexp` does not yield exactly the same outputs as `torch.logaddexp`. I've left it as is and slightly lowered the tolerance of the tests, but it would be possible to force a conversion to NumPy and back to torch to ensure better unification between the frameworks.

I've also added the `soft_clip` function in `monai.transforms.utils.py`, with the associated unit tests, to ensure that the transformation works properly.

Types of changes
- Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- Documentation updated, tested the `make html` command in the `docs/` folder.
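On the `softplus` helper discussed in the description: the reason a helper is worth unifying across frameworks is numerical stability. A minimal pure-Python sketch of the stable form (illustrative only, not MONAI's actual implementation):

```python
import math

def softplus(x: float) -> float:
    # softplus(x) = log(1 + exp(x)) = logaddexp(x, 0).
    # The naive form overflows for large positive x; rewriting it as
    # max(x, 0) + log1p(exp(-|x|)) stays finite for all inputs.
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))
```

Both `np.logaddexp` and `torch.logaddexp` compute this kind of stable reduction internally, but their rounding can differ slightly, which is consistent with the tolerance adjustment mentioned above.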