Mask the background out of loaded MRIs (#387) #391
Conversation
sadhana-r
left a comment
Code design looks good. Very cool to see the volume masked!
My understanding from the meeting yesterday was that Peter wanted the volume to be masked when the skin segmentation is computed and displayed during virtual fit (or tracking), since that pipeline already computes the foreground mask. That way the computation doesn't get repeated when loading and then running those pipelines. But this works too! The loading is a little slower but not too annoying.
One issue I found with the current approach is that volume_array has type uint16, so when the minimum value in the image is 0 you end up setting the background to -1, which isn't representable. The background therefore wraps around to 65535 and shows up as white (e.g. when loading volume petra in OW2_001).

One fix I tested is to instead set the background to volume_array_max + 1, which should hopefully not exceed 65535, but it might be better to add some error handling for those edge cases, or to change the data type.
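For illustration, a tiny numpy sketch of that wraparound (the intensity values here are made up; only the uint16 dtype matters):

import numpy as np

volume_array = np.array([0, 120, 340], dtype=np.uint16)  # stand-in for the MRI voxels

background_value = volume_array.min() - 1  # uint16 arithmetic: 0 - 1 wraps around
print(background_value)                    # 65535, so the "background" becomes the brightest value

# The workaround suggested above: mark the background with max + 1 instead, which stays
# representable unless the volume already contains 65535.
safe_background = volume_array.max() + 1   # 341 here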
Oh wow, glad you tried those examples! I will think about how to deal with the integer overflow. Maybe by always forcing a float type.
Yeah, I think your understanding of the meeting is correct. I am cheating because I didn't want to add so much logic for sharing the mask result. And if someone saves the session after pre-planning and then loads it again later, the mask would be gone unless we also add logic for writing it to the database somehow. This seemed like a good tradeoff. I wonder if we can speed up the …
Hmmm, true. Related to this: if a user loads a session where tracking or planning has already been run, the skin segmentation is not shown unless the user re-runs virtual fit or goes through and 're-approves' the tracking result. This isn't ideal, but I left it as is.
Yes, I understood it to work that way. I think it is okay -- Peter can request a different behavior later if he wants. It will require core openlifu changes to support storing the skin segmentation, unless we go with this idea:
This is a good idea. I still think we can hold off and get some feedback. It might be fine the way it is. Showing the skin segmentation in the main scene is mainly for the cool factor, I think, and not something functional yet.
@sadhana-r Ready for another look. I am planning to hold off on merging until some of your PRs get merged though, so no rush. Thanks for the review!
volume_array = slicer.util.arrayFromVolume(volume_node)
volume_array_min = volume_array.min()
background_value = int(volume_array_min) - 1  # plain Python int so the subtraction doesn't wrap around for unsigned dtypes
if background_value < volume_node.GetImageData().GetScalarTypeMin():  # e.g. if volume_array_min is 0 and it's an unsigned int type
    logging.info("Casting volume to float for the sake of `threshold_volume_by_foreground_mask`.")
    cast_volume_to_float(volume_node)
This is how I addressed the problem you pointed out: if the value I am trying to use for background (e.g. -1) is less than the minimum for the data type (e.g. 0 for unsigned integer types), then we cast to float before proceeding.
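For context, cast_volume_to_float could look roughly like the sketch below. This is just one plausible implementation built on Slicer's utility functions, not necessarily the exact code in this PR:

import numpy as np
import slicer

def cast_volume_to_float(volume_node):
    # Pull the voxels out as a numpy array, cast to float32, and write them back.
    # slicer.util.updateVolumeFromArray reallocates the image data to match the
    # array's dtype, so afterwards the volume can hold values like min - 1 even
    # when the original minimum was 0.
    volume_array = slicer.util.arrayFromVolume(volume_node)
    slicer.util.updateVolumeFromArray(volume_node, volume_array.astype(np.float32))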
sadhana-r
left a comment
looks good to me!
I will add a commit to this to also compute the skin segmentation on session load, but only if the session has a virtual fit result.
Sometimes the volume display doesn't update after thresholding and this helps to nudge it to update.
They don't really belong to the skinseg module
Force-pushed from 0d3fe27 to 0ca336a
Closes #387
One thing I slightly dislike about this method is that it modifies the values in the volume array: it makes the background voxels take a value that is 1 less than the minimum value in the volume. This way they can simply be thresholded out. There should be other ways to do this that involve creating a segmentation node out of the foreground mask and then using it as a "cropping ROI" for the volume. This would be nicer but then it's another piece of data to manage. This method keeps things simple, and it doesn't seem to harm the downstream skin segmentation computation.
Another thing I slightly dislike about this method is that it takes some time to compute the foreground mask, so this makes loading a volume into the scene a slower process. But it isn't too bad, at least on my system, and the result is definitely very pleasing to look at.
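In rough outline, the approach amounts to something like the sketch below. The body is illustrative rather than the actual PR code: it assumes foreground_mask is a boolean numpy array aligned with the volume array (how that mask is computed is out of scope here), and it uses display-node thresholding as one way to hide the marked background voxels.

import numpy as np
import slicer

def threshold_volume_by_foreground_mask(volume_node, foreground_mask):
    # Work on a copy of the voxel array and write it back at the end.
    volume_array = slicer.util.arrayFromVolume(volume_node).copy()
    original_min = int(volume_array.min())
    background_value = original_min - 1

    # As discussed in the review: a marker like -1 does not fit in an unsigned
    # integer volume, so fall back to float in that case.
    if np.issubdtype(volume_array.dtype, np.integer) and background_value < np.iinfo(volume_array.dtype).min:
        volume_array = volume_array.astype(np.float32)

    # Mark background voxels with a value just below the original minimum.
    volume_array[~foreground_mask] = background_value
    slicer.util.updateVolumeFromArray(volume_node, volume_array)

    # Hide everything below the original minimum in the volume display.
    display_node = volume_node.GetDisplayNode()
    display_node.SetApplyThreshold(True)
    display_node.SetLowerThreshold(float(original_min))
    display_node.SetUpperThreshold(float(volume_array.max()))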
For review: Load a volume into the scene by loading a session, or by doing it manually, and observe that it is immediately masked after being loaded in. Give the code a cursory look.