
Conversation

@philippotto (Member) commented Jan 14, 2025

This PR adds rendering support for uint32 color layers, int8, int16, int32 for color + segmentation, and int64 for segmentation layers.

Histogram and thumbnail support was only partially implemented; the rest can happen in a follow-up PR.

URL of deployed dev instance (used for testing):

  • https://___.webknossos.xyz

Steps to test:

  • see 15 new datasets (dtype_test_*) on dev instance
  • monkey test them (also use histogram view, brushing and editable mappings)
  • no need to test all combinations, because there are automated screenshot tests for that

TODOs:

| dtype | color layers | segmentation layers | histogram | thumbnail |
| --- | --- | --- | --- | --- |
| uint8 | unchanged | unchanged | unchanged | unchanged |
| uint16 | unchanged | unchanged | unchanged | unchanged |
| uint24 rgb | unchanged | does not apply | unchanged | unchanged |
| uint32 | ☑️ | unchanged | unchanged | unchanged |
| uint64 | unchanged | unchanged | unchanged | (segmentation only) |
| int8 | ☑️ | ☑️ | ☑️ | ☑️ |
| int16 | ☑️ | ☑️ | ☑️ | ☑️ |
| int32 | ☑️ | ☑️ | ☑️ | ☑️ |
| int64 | | ☑️ | | |
| float | unchanged | unchanged | ☑️ | |
| double | | | | |

Issues:


(Please delete unneeded items, merge only when none are left open)

@philippotto self-assigned this Jan 14, 2025
@coderabbitai bot (Contributor) commented Jan 14, 2025

📝 Walkthrough


This large pull request makes extensive adjustments across the frontend and backend code. In the frontend TypeScript code, many method signatures and type definitions are updated—such as replacing union types with a generalized TypedArray and switching from Vector2 to tuple types for intensity ranges—while texture initialization, uniform handling, and error message construction are streamlined. Shader modules and test cases are refactored to remove redundant parameters and improve clarity. In addition, multiple Scala files in the datastore and tracingstore have migrated from using UnsignedInteger types to a new SegmentInteger type, refining data conversion and histogram logic.
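As a rough TypeScript illustration of the type generalization described above (the names below are assumptions for the sketch, not the actual WEBKNOSSOS identifiers):

```typescript
// Hypothetical sketch: a single TypedArray union can replace per-dtype
// signatures, and intensity ranges become plain tuples instead of a Vector2.
type TypedArray =
  | Uint8Array
  | Uint16Array
  | Uint32Array
  | Int8Array
  | Int16Array
  | Int32Array
  | Float32Array
  | Float64Array
  | BigUint64Array
  | BigInt64Array;

type IntensityRange = readonly [number, number];

// Example consumer of the tuple type:
function clampToRange(value: number, [min, max]: IntensityRange): number {
  return Math.min(Math.max(value, min), max);
}
```

Tuple destructuring in the parameter list keeps call sites free of `.x`/`.y` accessors, which is one motivation for moving away from Vector2 here.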

Changes

| File(s) | Change Summary |
| --- | --- |
| frontend/.../libs/{UpdatableTexture.ts, abstract_cuckoo_table.ts, utils.ts, messages.tsx} | Updated method signatures and type definitions; streamlined texture creation and error messaging; enhanced typed array handling and signed number conversion. |
| frontend/.../oxalis/{api, constants, geometries, accessors, model, shaders, view, store, sagas} | Revised type definitions (e.g., TypedArray, intensityRange), updated uniform and shader functions, replaced bit depth/channel count logic with supported range methods, and integrated the Ace editor for shader editing. |
| frontend/.../test/{model, puppeteer, shaders, fixtures} | Simplified test setups by removing obsolete parameters, updated function signatures, and added new screenshot helpers and constants for clearer dataset rendering tests. |
| webknossos-{datastore, tracingstore}/** | Replaced UnsignedInteger types with a new SegmentInteger; updated conversion, protocol, and histogram logic for segmentation and volume tracing; refined Scala service implementations. |
| Others (.github/workflows, package.json, AnnotationController.scala, AnnotationService.scala, CHANGELOG.unreleased.md) | Added a new dependency, enhanced workflow URL generation through dynamic branch naming, revised dataset argument usage in controllers/services, and updated the changelog with new data type support and a command palette feature. |

Suggested labels

enhancement, testing, bug

Suggested reviewers

  • hotzenklotz
  • daniel-wer

Poem

Hop, hop, I’m a rabbit on the run,
Leaping through code with lots of fun,
Typed arrays now dance in a neat parade,
Bugs and clutter gracefully fade,
With each new fix, our forest gleams bright—
A joyful hop into a cleaner byte!
🥕🐇


@coderabbitai bot (Contributor) left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala (1)

82-99: Consider tightening defaultIntensityRange for 64-bit types.
Currently, uint64/int64 both default to a [0..255] range, relying on the fact that segmentation layers rarely call this method. You could optionally throw an error or log a warning if 64-bit segmentation data calls defaultIntensityRange by mistake.
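The suggested guard could look roughly like this (a TypeScript analogue of the Scala method, with assumed ranges; the real logic lives in `DataLayer.defaultIntensityRange`):

```typescript
// Illustrative sketch only; the float/double defaults below are assumptions.
type ElementClass =
  | "uint8" | "uint16" | "uint24" | "uint32" | "uint64"
  | "int8" | "int16" | "int32" | "int64" | "float" | "double";

function defaultIntensityRange(elementClass: ElementClass): [number, number] {
  switch (elementClass) {
    case "uint8":
    case "uint24": // rgb, 8 bit per channel
      return [0, 255];
    case "uint16":
      return [0, 65535];
    case "uint32":
      return [0, 2 ** 32 - 1];
    case "int8":
      return [-128, 127];
    case "int16":
      return [-32768, 32767];
    case "int32":
      return [-(2 ** 31), 2 ** 31 - 1];
    case "uint64":
    case "int64":
      // 64-bit classes are segmentation-only; fail loudly instead of
      // silently falling back to [0, 255].
      throw new Error(`defaultIntensityRange is not defined for ${elementClass}`);
    case "float":
    case "double":
      return [0, 1]; // assumed default for float data
  }
}
```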

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between aa22c3b and 19dffff.

📒 Files selected for processing (8)
  • app/controllers/AnnotationIOController.scala (1 hunks)
  • app/models/annotation/AnnotationService.scala (3 hunks)
  • app/models/annotation/nml/NmlParser.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/helpers/ProtoGeometryImplicits.scala (1 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala (5 hunks)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/RemoteFallbackLayer.scala (2 hunks)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingLayer.scala (4 hunks)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeTracingService.scala (8 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/helpers/ProtoGeometryImplicits.scala
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/editablemapping/EditableMappingLayer.scala
  • app/models/annotation/AnnotationService.scala
🧰 Additional context used
🧠 Learnings (1)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala (2)
Learnt from: MichaelBuessemeyer
PR: scalableminds/webknossos#8325
File: webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala:80-93
Timestamp: 2025-03-12T13:04:06.063Z
Learning: In WebKnossos, uint64 and int64 element classes are only supported for segmentation layers, where the frontend uses a fixed range of 0-255, and defaultIntensityRange should not be called for these types.
Learnt from: MichaelBuessemeyer
PR: scalableminds/webknossos#8325
File: webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala:80-93
Timestamp: 2025-03-12T13:04:06.063Z
Learning: In WebKnossos, the full range of 64-bit integers (-2^63 to 2^63-1) is used in the frontend for int64 layers, with potential precision loss being accepted as an unsupported value range.
⏰ Context from checks skipped due to timeout of 90000ms (1)
  • GitHub Check: circleci_build
🔇 Additional comments (21)
app/controllers/AnnotationIOController.scala (2)

336-339: Improved element class handling

The code now uses ElementClass.toProto instead of direct assignment, and wraps the result in a Fox monad with .toFox. This change aligns with the PR objective of supporting additional data types (int8, int16, int32, int64) for rendering.


343-343: Variable name updated to match its content

The variable name change from elementClass to elementClassProto more accurately describes that the value is a protobuf representation of the element class.

webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/VolumeTracingService.scala (8)

224-227: Well-aligned type migration for expanded dtype support

The code has been updated to use SegmentIntegerArray and SegmentInteger instead of UnsignedIntegerArray and UnsignedInteger. This change supports the PR objective of rendering more data types, including signed integers.


757-758: Clear and appropriate variable renaming

Renaming from elementClass to elementClassProto improves code clarity by explicitly indicating that this is a protobuf representation.


786-786: Consistent variable update

The updated constructor call properly uses the renamed elementClassProto variable.


796-800: Consistent variable update in error message

The error message reporting now correctly uses the renamed elementClassProto variable, maintaining consistency throughout the codebase.


819-820: Consistent parameter update

The saveBucket call has been updated to use the renamed elementClassProto parameter.


830-831: Consistent parameter update

The updateSegmentIndex call has been updated to use the renamed elementClassProto parameter.


905-905: Support for signed integer values

Changed from toPositiveLong to toLong, which correctly allows for negative segment IDs. This is essential for supporting signed integer types as mentioned in the PR objectives.
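As a frontend-side illustration of the same signedness concern (the function name here is hypothetical), the identical 8 bytes decode to different segment IDs depending on whether they are read signed or unsigned:

```typescript
// Hypothetical helper: decode the first 8 bytes of a bucket as a segment ID.
function readSegmentId(buffer: ArrayBuffer, signed: boolean): bigint {
  const view = new DataView(buffer);
  return signed
    ? view.getBigInt64(0, true) // little-endian signed read; negative IDs possible
    : view.getBigUint64(0, true);
}
```

For example, a buffer of eight 0xFF bytes is segment -1 when read signed, but 2^64 - 1 when read unsigned — which is why a blanket `toPositiveLong` conversion cannot represent signed segmentations.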


920-923: Code structure improvement

The code structure has been slightly improved, making the flow more readable while maintaining the same functionality.

webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/RemoteFallbackLayer.scala (3)

11-11: Import added for ElementClass

The import for ElementClass has been added to support the new element class conversion functionality in the fromDataLayerAndDataSource method. This is a necessary addition for the enhanced dtype support.


14-14: Box import added for error handling

The import for net.liftweb.common.Box has been added to support the return type change of fromDataLayerAndDataSource. This enables proper error handling when element class conversion fails.


24-28:

✅ Verification successful

Improved error handling in fromDataLayerAndDataSource method

The method signature has been changed from returning RemoteFallbackLayer to Box[RemoteFallbackLayer], which is a significant improvement for error handling. The implementation now properly handles potential failures in element class conversion by using the Box monadic container.

This change aligns with the PR objective of supporting more data types, as it allows graceful handling of conversion failures for the newly supported dtypes (uint32, int8, int16, int32, int64).


🏁 Script executed:

```shell
#!/bin/bash
# Check for any places that call this method to ensure they handle the new Box return type
rg "fromDataLayerAndDataSource" --type scala -A 3 -B 3
```

Length of output: 2428


Verified: Improved Error Handling for fromDataLayerAndDataSource

The update changing the method’s return type from RemoteFallbackLayer to Box[RemoteFallbackLayer] has been confirmed. This change correctly leverages the Box monadic container to gracefully handle conversion failures (for the new supported data types such as uint32, int8, int16, int32, int64). Moreover, our verification shows that the call site in VolumeTracingService.scala properly processes the Box result by chaining with .toFox.

  • The method conversion now safely wraps potential failures.
  • Callers (e.g., in VolumeTracingService.scala) correctly handle the Box type.
  • The implementation aligns with the PR objective for enhanced error management.

Approving these changes.
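For readers less familiar with Lift's Box, the pattern can be mirrored in TypeScript terms (purely illustrative; the mapping table below is an assumption, the real one lives in ElementClass.toProto):

```typescript
// A result type that forces callers to handle conversion failure,
// analogous to Scala's Box[RemoteFallbackLayer].
type Box<T> = { kind: "full"; value: T } | { kind: "failure"; msg: string };

interface RemoteFallbackLayer {
  elementClassProto: string;
}

// Hypothetical conversion table standing in for ElementClass.toProto.
const protoByElementClass: Record<string, string> = {
  uint8: "uint8", uint16: "uint16", uint32: "uint32", uint64: "uint64",
  int8: "int8", int16: "int16", int32: "int32", int64: "int64",
};

function fromDataLayer(elementClass: string): Box<RemoteFallbackLayer> {
  const proto = protoByElementClass[elementClass];
  return proto !== undefined
    ? { kind: "full", value: { elementClassProto: proto } }
    : { kind: "failure", msg: `unsupported element class: ${elementClass}` };
}
```

The benefit is the same as with Box: the type system makes the failure path explicit at every call site instead of letting an unsupported element class throw deep inside the conversion.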

app/models/annotation/nml/NmlParser.scala (1)

12-12: LGTM: Addition of ElementClassProto import.

This import facilitates the transition from using ElementClass to ElementClassProto, which aligns with the PR's objective of supporting additional data types for rendering.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala (7)

8-8: Import usage confirmed.
This new import for ElementClassProto is clearly needed for the added fromProto and toProto functions.


26-26: New imports for Liftweb components are valid.
These imports (Box, Failure, Full) are suitable for the newly introduced return types in toProto and error handling.


48-48: Expanded segmentationElementClasses set.
Including int8, int16, int32, and int64 aligns with the PR objective of supporting more signed integer types for segmentation.


67-80: isSigned method appears consistent.
Treating float/double as “true” for signedness is reasonable. The mapping covers all element classes without gaps.
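A minimal sketch of such a mapping (TypeScript analogue; the actual method is Scala's `isSigned` on the element class):

```typescript
// Illustrative signedness table covering all element classes without gaps;
// float/double are treated as signed, matching the review note above.
const isSigned: Record<string, boolean> = {
  uint8: false, uint16: false, uint24: false, uint32: false, uint64: false,
  int8: true, int16: true, int32: true, int64: true,
  float: true, double: true,
};
```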


115-128: fromProto mapping is straightforward.
All new integer element classes are handled. The thrown exception for unrecognized values is appropriate.


130-141: Protecting toProto from unsupported classes.
Returning a Failure if the element class is not mapped prevents hidden errors. Consider clarifying float/double usage in the future if needed.


145-152: Validate whether negative IDs should be allowed.
The code sets max segment IDs for signed types yet also assumes largestSegmentId >= 0. If negative segment IDs are truly unsupported, this is fine. Otherwise, the system might erroneously reject valid negative IDs.

```diff
  editPosition = nmlParams.editPosition,
  editRotation = nmlParams.editRotation,
- elementClass = ElementClass.uint32, // Note: this property may be adapted later in adaptPropertiesToFallbackLayer
+ elementClass = ElementClassProto.uint32, // Note: this property may be adapted later in adaptPropertiesToFallbackLayer
```

💡 Verification agent

🧩 Analysis chain

Consider dynamic element class assignment instead of hardcoding to uint32.

While changing to ElementClassProto.uint32 is consistent with the import change, the PR objective is to support multiple data types (int8, int16, int32, int64, etc.). Hardcoding to uint32 might limit the flexibility needed for the enhanced data type support.


🏁 Script executed:

```shell
#!/bin/bash
# Check if there are any other places where ElementClassProto is used dynamically

# Look for ElementClassProto references
rg -A 5 "ElementClassProto" --type scala

# Check if there is any code that determines element class based on data type
rg -A 10 "elementClass =" --type scala
```

Length of output: 56119


Action Required: Use Dynamic Element Class Assignment in NmlParser.scala

The current implementation in NmlParser.scala (line 78) still hardcodes elementClass as ElementClassProto.uint32. Given that the PR aims to support multiple data types (e.g., int8, int16, int32, int64, etc.), this reduces flexibility and may cause inconsistencies with other parts of the codebase that derive the element class dynamically.

  • File: app/models/annotation/nml/NmlParser.scala
    Line: 78
    Issue: elementClass is hardcoded instead of being computed based on context (for example, using input parameters or configuration similar to dynamic assignments in VolumeTracing or DataLayer modules).

Please consider refactoring the assignment to derive elementClass dynamically and ensure it aligns with the enhanced data type support objective. Alternatively, add a clear justification if dynamic assignment is to be deferred.
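A dynamic assignment could roughly look like this (hypothetical TypeScript sketch; the real parser is Scala and its API may differ):

```typescript
// Hypothetical shape of a fallback layer as seen by the parser.
interface FallbackLayer {
  elementClass: string;
}

// Derive the element class from the fallback layer when one is present;
// otherwise keep the uint32 default, which is later adapted in
// adaptPropertiesToFallbackLayer.
function resolveElementClass(fallbackLayer: FallbackLayer | null): string {
  return fallbackLayer !== null ? fallbackLayer.elementClass : "uint32";
}
```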

@fm3 (Member) left a comment

Backend LGTM, retest looked good.

Could not test segment stats since there is no signed int segmentation dataset with a segment index file. Do you know if the segment index file format would support this, especially with negative IDs? Otherwise the job to compute one might fail for these segmentation layers. Same question for the compute meshes job.

I guess from now on we need to test all new segmentation/volume/mapping/proofreading/mesh features also with segmentations with negative IDs… Admittedly, such datasets are rarely used by users and maybe we can fix some issues later as they occur.

@philippotto (Member, Author)

> Could not test segment stats since there is no signed int segmentation dataset with a segment index file.

How can I trigger the index computation? Is it done on DS upload (if the worker is set up)? My suggestion would be to test this once it's deployed and then schedule a follow-up if necessary.

> I guess from now on we need to test all new segmentation/volume/mapping/proofreading/mesh features also with segmentations with negative IDs… Admittedly, such datasets are rarely used by users and maybe we can fix some issues later as they occur.

I think we don't necessarily need to add this to our testing agenda. As you say, such datasets should be very rare, and the current PR ensures that at least basic functionality (rendering, annotating, ad-hoc meshing, etc.) works. If users run into problems, we can fix that ad hoc. Prior to this PR, datasets with negative IDs wouldn't work either, so it's still an improvement.

@fm3 (Member) commented Mar 19, 2025

> How can I trigger the index computation?

I think it's an item in the layer sidebar for segmentation layers. There is a function getComputeSegmentIndexFileButton in layer_settings_tab.tsx.

> schedule a follow up if necessary

Ok, agreed!

@MichaelBuessemeyer (Contributor)

Wasn't the ComputeSegmentIndexFile job removed 🤔?

If I recall correctly, some months ago you deprecated one of the jobs, @fm3.

@fm3 (Member) commented Mar 19, 2025

> If I recall correctly some months ago you deprecated one of the jobs

That would probably be globalize_floodfills in #7173? I don't know about any other removed job 🤔

@MichaelBuessemeyer (Contributor) left a comment

Latest frontend changes lgtm. Is retesting the fixes needed before merging or did you do that already @philippotto?

@philippotto (Member, Author)

> Latest frontend changes lgtm. Is retesting the fixes needed before merging or did you do that already @philippotto?

I'll do that before merging. Thanks for the approval!


Successfully merging this pull request may close these issues.

Complete dtype feature support
3 participants