
Improve client-side error handling #122

Open
pieper opened this issue Aug 23, 2022 · 18 comments

pieper commented Aug 23, 2022

The interface shows for a bit and then the browser goes blank.

https://viewer.imaging.datacommons.cancer.gov/slim/studies/2.25.174442227011591161839224535127252003551/series/1.3.6.1.4.1.5962.99.1.2352175091.700331458.1655914616819.4.0

This message shows up in the developer console:

react-dom.production.min.js:189 Error: Pyramid of optical path "1" is different from reference pyramid.
    at O (viewer.js:1033:17)
    at new e (viewer.js:948:71)
    at ke (SlideViewer.tsx:111:24)
    at new n (SlideViewer.tsx:433:43)
    at Ba (react-dom.production.min.js:147:172)
    at Nu (react-dom.production.min.js:198:97)
    at xi (react-dom.production.min.js:292:172)
    at bs (react-dom.production.min.js:280:389)
    at gs (react-dom.production.min.js:280:320)
    at vs (react-dom.production.min.js:280:180)

Desired behavior would be at least an error message to the user and graceful handling. Even better would be to handle this kind of data (although that may not be possible if the data is invalid for some reason).

@hackermd hackermd self-assigned this Aug 23, 2022
@hackermd hackermd added enhancement New feature or request idc:priority Priority for the NCI's Imaging Data Commons data Likely due to non-conformant data labels Aug 23, 2022
hackermd commented

Thanks @pieper for reporting the issue. We will need to investigate whether there is a problem with the data. However, I agree that the error handling will also need to be improved.

hackermd commented

Errors that are thrown upon parsing of the data are difficult to catch/handle, because they happen deep in the library. It may be cleanest to introduce an error handler callback that allows the library to include a detailed message for known data issues that is explicitly intended for display to the user.
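A rough sketch of what such a hook could look like (the `DataError` shape, `onError` option, and helper functions below are hypothetical, not the actual Slim or dicom-microscopy-viewer API):

```js
// Hypothetical sketch: wrap errors thrown deep in the parsing code into an
// object that carries a short message intended for display to the user.
class DataError extends Error {
  constructor (message, userMessage) {
    super(message)
    this.name = 'DataError'
    this.userMessage = userMessage // short, non-technical text for the UI
  }
}

function buildPyramid (metadata) {
  // placeholder for the real pyramid construction, which may throw
  if (metadata.length === 0) {
    throw new Error('Pyramid of optical path "1" is different from reference pyramid.')
  }
  return { levels: metadata.length }
}

function parseMetadata (metadata, { onError } = {}) {
  try {
    return buildPyramid(metadata)
  } catch (error) {
    if (onError != null) {
      onError(new DataError(
        error.message,
        'The image could not be displayed because its metadata appear to be inconsistent.'
      ))
      return null
    }
    throw error
  }
}

// Usage: the application decides how to surface the error (dialog, toast, ...)
parseMetadata([], { onError: (err) => console.error(err.userMessage) })
```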


fedorov commented Aug 31, 2022

@hackermd since this is affecting IDC production viewer, it is a priority for IDC to at least know what is going on here. What can we do to expedite investigation of this issue?


fedorov commented Aug 31, 2022

cc: @Punzo


Punzo commented Aug 31, 2022

In general, we need to improve error handling in Slim.

I see three open issues related to error handling:

#106
#122
#123

We can discuss the specifics tomorrow, but in general the error handler should not block the viewer. In addition, the error logging service should always report errors in a popup dialog or a separate debug window (as we do for OHIF).

For customized error handling, we could indeed implement callbacks that are set up in the config file, e.g.: https://github.com/OHIF/Viewers/blob/master/platform/viewer/public/config/idc.js#L12 (a rough sketch follows below).

Anyway, I have already put this on tomorrow's agenda: https://cornerstonejs.slack.com/archives/GP0DDMVMF/p1660725079247979
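As an illustration only (the option names below are hypothetical and modeled on the OHIF config linked above, not existing Slim options):

```js
// Hypothetical sketch of an error-handling callback configured at deployment
// time, similar in spirit to OHIF's errorHandler in config/idc.js.
function showErrorDialog (message) {
  // placeholder for a popup dialog or separate debug window in the real app
  console.error('[viewer error]', message)
}

window.config = {
  path: '/',
  // called for any error surfaced by the viewer; the application reports it
  // to the user without blocking the rest of the UI
  onError: ({ error, isFatal = false }) => {
    showErrorDialog(error.message)
    if (isFatal) {
      // optionally swap in a "this slide could not be loaded" placeholder view
    }
  }
}
```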


Punzo commented Aug 31, 2022

Regarding this issue: optical paths with different geometric properties (e.g., orientation, spacing) are simply not supported. We could add a registration step, but it would be very expensive on the client side. Alternatively, the metadata may simply be wrong, but then the viewer cannot work if we cannot trust the metadata.


hackermd commented Sep 1, 2022

@fedorov I have started to work on improving the error handling. As stated above, the viewer should definitely handle such errors more gracefully. Providing a meaningful error message to the user that contains the right level of detail will require some thought and development work.

To fix the underlying issue, we will also need to investigate the data, because the issue is most likely due to a problem with the data rather than a bug in the viewer. The viewer requires pyramids of different channels to have the same structure. That does not seem to be the case here.

Image with Optical Path Identifier "1":

=== Pyramid Level 0 ===
Total Pixel Matrix Rows/Columns: (29226, 33635)
Pixel Spacing: (0.000649983750406, 0.000649983750406)
Downsampling Factors: (1.0, 1.0)
=== Pyramid Level 1 ===
Total Pixel Matrix Rows/Columns: (14613, 16818)
Pixel Spacing: (0.001299928852712, 0.001299928852712)
Downsampling Factors: (2.0, 1.9999405398977286)
=== Pyramid Level 2 ===
Total Pixel Matrix Rows/Columns: (7307, 8409)
Pixel Spacing: (0.002599857705424, 0.002599857705424)
Downsampling Factors: (3.9997262898590393, 3.999881079795457)
=== Pyramid Level 3 ===
Total Pixel Matrix Rows/Columns: (3654, 4205)
Pixel Spacing: (0.005199097133154, 0.005199097133154)
Downsampling Factors: (7.998357963875205, 7.998810939357908)
=== Pyramid Level 4 ===
Total Pixel Matrix Rows/Columns: (1827, 2103)
Pixel Spacing: (0.010395722037524, 0.010395722037524)
Downsampling Factors: (15.99671592775041, 15.993818354731337)
=== Pyramid Level 5 ===
Total Pixel Matrix Rows/Columns: (914, 1052)
Pixel Spacing: (0.020781562209994, 0.020781562209994)
Downsampling Factors: (31.975929978118163, 31.972433460076047)
=== Pyramid Level 6 ===
Total Pixel Matrix Rows/Columns: (457, 526)
Pixel Spacing: (0.041563124419988, 0.041563124419988)
Downsampling Factors: (63.951859956236326, 63.944866920152094)

Image with Optical Path Identifier "0":

=== Pyramid Level 0 ===
Total Pixel Matrix Rows/Columns: (29226, 33635)
Pixel Spacing: (0.000649999976158, 0.000649999976158)
Downsampling Factors: (1.0, 1.0)
=== Pyramid Level 1 ===
Total Pixel Matrix Rows/Columns: (14613, 16818)
Pixel Spacing: (0.001299961303251, 0.001299961303251)
Downsampling Factors: (2.0, 1.9999405398977286)
=== Pyramid Level 2 ===
Total Pixel Matrix Rows/Columns: (7307, 8409)
Pixel Spacing: (0.002599922606502, 0.002599922606502)
Downsampling Factors: (3.9997262898590393, 3.999881079795457)
=== Pyramid Level 3 ===
Total Pixel Matrix Rows/Columns: (3654, 4205)
Pixel Spacing: (0.005199226919876, 0.005199226919876)
Downsampling Factors: (7.998357963875205, 7.998810939357908)
=== Pyramid Level 4 ===
Total Pixel Matrix Rows/Columns: (1827, 2103)
Pixel Spacing: (0.010395981549253, 0.010395981549253)
Downsampling Factors: (15.99671592775041, 15.993818354731337)
=== Pyramid Level 5 ===
Total Pixel Matrix Rows/Columns: (914, 1052)
Pixel Spacing: (0.020782080986767, 0.020782080986767)
Downsampling Factors: (31.975929978118163, 31.972433460076047)
=== Pyramid Level 6 ===
Total Pixel Matrix Rows/Columns: (457, 526)
Pixel Spacing: (0.041564161973534, 0.041564161973534)
Downsampling Factors: (63.951859956236326, 63.944866920152094)

Note that the Pixel Spacing is different although the Total Pixel Matrix Rows/Columns are the same (the Imaged Volume Height/Width are also different).
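For reference, the difference between the two base-level Pixel Spacing values can be computed directly from the dumps above (a quick back-of-the-envelope check, not part of the viewer code):

```js
// base-level Pixel Spacing (mm) taken from the two dumps above
const spacing0 = 0.000649999976158 // optical path "0"
const spacing1 = 0.000649983750406 // optical path "1"

const absoluteDiff = Math.abs(spacing0 - spacing1) // ~1.6e-8 mm
const relativeDiff = absoluteDiff / spacing0       // ~2.5e-5, i.e. on the order of 0.002 %

console.log(absoluteDiff, relativeDiff)
```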

That information can also be found in the console:
[Screenshot: browser console output showing the same pyramid metadata, Aug 31, 2022]

@dclunie could you kindly help us debug these data sets?


fedorov commented Sep 1, 2022

Thank you @hackermd and @Punzo - let's wait to hear from David. To me, it makes sense to add some logic to the converter to detect this kind of issue at the time of conversion.


hackermd commented Sep 1, 2022

I agree @fedorov. Ideally, we would avoid these issues upon conversion. At the same time, we want the viewer to be as robust as possible and at least handle such errors gracefully.


dclunie commented Sep 1, 2022

Interesting - not sure why there is a 0.002 % difference between the pixel spacing values in the first place - I will look into this.

That said, it might be better if the viewer compared floating point values within a specified tolerance rather than exactly:

http://randomascii.wordpress.com/2012/02/25/comparing-floating-point-numbers-2012-edition/


Punzo commented Sep 1, 2022

The current eps tolerance is 1e-6. We could make it less strict if needed.

https://github.com/herrmannlab/dicom-microscopy-viewer/blob/b8393862f3d23f7f3c750505b36873dba4e46a0f/src/utils.js#L421
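For illustration, an absolute-epsilon check of that kind behaves as follows on the values from the dumps above (a sketch only; the actual implementation is in the linked utils.js):

```js
// sketch of an absolute-epsilon comparison of two floating-point values
function areNumbersAlmostEqual (a, b, eps = 1e-6) {
  return Math.abs(a - b) < eps
}

// the base-level Pixel Spacing values differ by only ~1.6e-8 mm and pass:
areNumbersAlmostEqual(0.000649999976158, 0.000649983750406) // true

// the lowest-resolution level (Level 6) values differ by ~1e-6 mm and fail:
areNumbersAlmostEqual(0.041564161973534, 0.041563124419988) // false
```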


fedorov commented Sep 1, 2022

Related to OHIF/Viewers#2869


dclunie commented Sep 1, 2022

Perhaps a percentage rather than an absolute value might be better.
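A relative (percentage-based) variant could look like this sketch (illustrative only, not an existing function in the library):

```js
// values are considered equal if they differ by less than a given
// fraction of their magnitude
function areNumbersAlmostEqualRelative (a, b, relTol = 1e-4) { // 1e-4 = 0.01 %
  const scale = Math.max(Math.abs(a), Math.abs(b))
  return Math.abs(a - b) <= relTol * scale
}

// the Level 6 Pixel Spacing values differ by ~0.0025 % and would now pass:
areNumbersAlmostEqualRelative(0.041564161973534, 0.041563124419988) // true
```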


hackermd commented Sep 1, 2022

@dclunie as @Punzo pointed out, we already use a tolerance for comparing floating-point values. We could choose a different tolerance (considering that many values are specified in millimeters and we need high precision). However, I don't think this issue is merely a problem with floating-point arithmetic. I tend to argue that if images are supposed to have the same pyramid structure (and especially if they have the same Pyramid UID), then the values for Pixel Spacing shall - at least in theory - be identical between the two instances. In fact, DICOM Part 3 Section A.1.2.30 Multi-Resolution Pyramid IE explicitly states:

Each layer is encoded as a separate DICOM Image, each of which has a uniform resolution (same value for Pixel Spacing (0028,0030)).

In my opinion, if rounding errors get introduced during processing (conversion, resampling, etc.), then it is the responsibility of the equipment that performs the processing and generates the DICOM data set to resolve any inconsistencies and ensure that constraints that are imposed by the standard (e.g., pyramid structure) are actually met. We need to make sure that all instances in a series have the same Series Instance UID, Frame of Reference UID, etc. We should apply the same rigor to the constraints of other information entities, including multi-resolution pyramids.

One could argue that a 0.002 % (~1 nanometer) difference doesn't matter in practice and should be tolerated, but problems like these are in my experience often indications of wider problems with a data set. I don't want to push back on improving the application logic (there is definitely room for improvement - in particular with respect to error handling and reporting), but we can't expect the viewer to always provide a workaround for problematic data.


dclunie commented Sep 1, 2022

I completely agree, and I will look into why it is occurring.

Practical experience with radiology 3D data over the years suggests, though, that too small an epsilon causes problems. Since the viewer is not being used as a quality control tool, making it too demanding doesn't really help anyone; although in this case, if it had just worked, we wouldn't have noticed the problem in the data.


hackermd commented Sep 1, 2022

Fully agreed! We will need to find the right balance between being strict enough for the app to function properly (including not introducing significant errors with spatial coordinates when creating annotations) and being tolerant enough not to just abort for every small inconsistency in the data. We wanted to be very strict by default to be on the safe side.

As discussed during the IDC call earlier today, I would advise against changing the tolerance for the floating-point value comparison in general. We probably all agree that the two values are indeed different. Instead, I would suggest handling the fact that the values are different at a higher level. In this case, we could allow the value of Pixel Spacing to differ between instances by a certain tolerance, which is larger than the tolerance we use to determine whether two floating-point values are the same. I would suggest specifying that tolerance in millimeters (or micrometers). We could also make that configurable, either upon installation/deployment or at runtime (as suggested by @fedorov).

The main question is: how tolerant do we want to be?
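One possible shape for that higher-level check (a hedged sketch; the option and function names are hypothetical, and the default value is only an example):

```js
// a dedicated, configurable tolerance (in mm) for comparing Pixel Spacing
// between instances of the same pyramid, separate from the generic
// floating-point equality epsilon
const defaultOptions = {
  pixelSpacingToleranceMm: 1e-5 // hypothetical default (10 nanometers)
}

function haveSamePixelSpacing (spacingA, spacingB, options = defaultOptions) {
  const tol = options.pixelSpacingToleranceMm
  return (
    Math.abs(spacingA[0] - spacingB[0]) <= tol &&
    Math.abs(spacingA[1] - spacingB[1]) <= tol
  )
}

// base-level spacings of optical paths "0" and "1" from the dumps above:
haveSamePixelSpacing(
  [0.000649999976158, 0.000649999976158],
  [0.000649983750406, 0.000649983750406]
) // true with a 1e-5 mm tolerance
```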

hackermd commented

@dclunie I've started to refactor the code to provide a tolerance for the values of the Pixel Spacing and the Y/X Offset in Slide Coordinate System attributes. Now I am wondering what tolerance we should use, and I have realized that there are some complications.

The offset of the images in the slide coordinate system should generally be the same, but since the offset is (or shall be) specified with respect to the center of the top left pixel, the values may nevertheless vary between images. How much deviation should we expect/tolerate? The expected difference could in principle be computed exactly, but my guess is that a lot of implementations get this wrong, for example by not updating the values when downsampling (I am not sure whether ours is even fully standard compliant in this regard).

For Pixel Spacing, we may want a different absolute tolerance for each resolution level, because a small deviation at the top of the pyramid may translate to a very large deviation at the bottom of the pyramid.

What are your thoughts?
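To illustrate the per-level idea, a minimal sketch under the assumption that a base-level tolerance is simply scaled by the downsampling factor of each level:

```js
// scale a base-level Pixel Spacing tolerance by the downsampling factor of
// each pyramid level, so the same relative deviation is tolerated everywhere
function spacingToleranceForLevel (baseToleranceMm, downsamplingFactor) {
  return baseToleranceMm * downsamplingFactor
}

// with a hypothetical base tolerance of 1e-5 mm:
spacingToleranceForLevel(1e-5, 1)  // Level 0: 1e-5 mm
spacingToleranceForLevel(1e-5, 64) // Level 6: 6.4e-4 mm
```

In effect this amounts to a relative tolerance expressed with respect to the base-level spacing.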

@hackermd hackermd changed the title Blank screen with HTAN data Improve client-side error handling Oct 5, 2022

fedorov commented Apr 27, 2023

Has this been addressed by #134? Need to review.
