Conversation
CRAZY COOOOL! this is VERY complete
Pull request overview
This PR adds support for the Tecan Spark plate reader by implementing a comprehensive async backend with USB communication, packet parsing, and data processing capabilities. The implementation includes controls for various instrument subsystems and processors for absorbance and fluorescence measurements.
Key Changes
- New async USB reader with packet parsing for Tecan Spark devices
- Absorbance and fluorescence data processors with real-world test coverage
- Comprehensive control modules for plate transport, optics, sensors, measurements, and system configuration
- Utility function for calculating non-overlapping well rectangles
Reviewed changes
Copilot reviewed 23 out of 24 changed files in this pull request and generated 2 comments.
Summary per file:
| File | Description |
|---|---|
| pylabrobot/plate_reading/utils.py | Adds helper to find non-overlapping rectangles covering plate wells |
| pylabrobot/plate_reading/tecan/spark20m/spark_reader_async.py | Implements async USB communication layer for Spark devices |
| pylabrobot/plate_reading/tecan/spark20m/spark_packet_parser.py | Parses binary measurement data packets from the device |
| pylabrobot/plate_reading/tecan/spark20m/spark_processor.py | Processes raw measurement data into absorbance/fluorescence values |
| pylabrobot/plate_reading/tecan/spark20m/spark_backend.py | Main backend integrating all components for plate reading operations |
| pylabrobot/plate_reading/tecan/spark20m/controls/*.py | Control modules for device subsystems (optics, sensors, movement, etc.) |
| pylabrobot/plate_reading/tecan/spark20m/*_tests.py | Unit tests for reader, processor, and backend components |
pylabrobot/plate_reading/tecan/spark20m/controls/system_control.py (review comment resolved, outdated)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
is this ready for review?
Code review changes: simplified the Spark control architecture.
- Remove AutoReaderProxy, integrate reading into send_command
- Remove reading context manager (no longer needed)
- Rename control classes to follow PEP8 (BaseControl, PlateControl, etc.)
- BaseControl now takes send_command directly instead of reader
- Make init_read and get_response private methods
- Add io.binary.Reader utility class
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
|
# Conflicts:
#   pylabrobot/io/binary.py
#   pylabrobot/io/usb.py
Force-pushed from fc9cf36 to b56ffdd.
Replace custom byte parsing helpers (_read_bytes, _read_u8, _read_u16, _read_u32, _read_string) with the Reader class from pylabrobot.io.binary.
Use pytest.approx() instead of exact equality for float comparisons in test_process_real_data tests. Compare row by row since pytest.approx does not support nested data structures.
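As a sketch of the row-by-row comparison described above (the test name and data values are illustrative, not taken from the Spark tests):

```python
import pytest

def test_process_real_data_rowwise():
    expected = [[0.101, 0.202], [0.303, 0.404]]
    actual = [[0.10100000001, 0.202], [0.303, 0.40399999999]]
    # pytest.approx does not recurse into nested lists,
    # so compare each row separately
    for actual_row, expected_row in zip(actual, expected):
        assert actual_row == pytest.approx(expected_row)
```

A single `assert actual == pytest.approx(expected)` on the nested lists would not apply the tolerance element-wise, which is why the loop is needed.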
Add max_workers parameter to USB class constructor with default of 1 to maintain backward compatibility. SparkReaderAsync passes max_workers=16 for its specific needs.
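A minimal sketch of such a constructor change (class shape and attribute names are assumed here, not taken from the actual pylabrobot source):

```python
from concurrent.futures import ThreadPoolExecutor

class USB:
    """Hypothetical sketch: blocking USB I/O dispatched to a thread pool."""

    def __init__(self, max_workers: int = 1):
        # default of 1 preserves the previous single-worker behavior;
        # a caller like SparkReaderAsync can pass max_workers=16
        # when it needs concurrent transfers
        self._executor = ThreadPoolExecutor(max_workers=max_workers)
```

Existing callers that construct `USB()` keep the old single-worker behavior, while the Spark reader can opt in to more parallelism.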
The PR fixed the timeout conversion for the main write but missed the zero-length packet write. PyUSB expects the timeout in milliseconds, so the timeout must be multiplied by 1000. Also add a max_workers parameter to USBValidator for signature compatibility.
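The unit conversion in question, sketched (the helper name is hypothetical; the commented-out write calls only illustrate where it applies):

```python
def to_pyusb_timeout_ms(timeout_s: float) -> int:
    """PyUSB's read/write calls take a timeout in milliseconds,
    while the backend API works in seconds."""
    return int(timeout_s * 1000)

# Both the main write and the zero-length packet write need the conversion:
# dev.write(endpoint, data, timeout=to_pyusb_timeout_ms(timeout_s))
# dev.write(endpoint, b"", timeout=to_pyusb_timeout_ms(timeout_s))
```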
Replace manual calculation of well spacing with the new item_dx and item_dy properties from ItemizedResource.
instead of returning False on error, can we raise the error in send_command? this is a more Pythonic way of dealing with errors
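The suggested pattern, sketched with hypothetical names (the actual exception type and command API are up to the PR):

```python
class SparkCommandError(RuntimeError):
    """Hypothetical error type for failed instrument commands."""

def send_command(command: str, device_accepted: bool) -> str:
    # raising instead of returning False means callers cannot
    # silently ignore a failure, and the traceback names the command
    if not device_accepted:
        raise SparkCommandError(f"device rejected command: {command!r}")
    return "OK"
```

Callers that do want to tolerate failures can wrap the call in `try/except SparkCommandError`, which keeps error handling explicit.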
- Remove unnecessary .keys() calls in dictionary iteration
- Use None instead of -1 as sentinel value with is/is not checks
- Remove unnecessary list() conversions when iterating bytes
- Use statistics.mean() for average temperature calculation
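The listed idioms in miniature (device names and values are illustrative, not from the Spark backend):

```python
import statistics

temperatures = {"incubator": 36.9, "chamber": 37.1}

# iterate the dict directly instead of temperatures.keys()
devices = [name for name in temperatures]

# statistics.mean() instead of sum(...) / len(...)
avg_temp = statistics.mean(temperatures.values())

# None as sentinel with `is` / `is not`, instead of -1
selected_index = None
if selected_index is None:
    selected_index = 0

# iterate bytes directly instead of wrapping in list()
data = b"\x01\x02"
values = [b for b in data]
```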
Methods like set_camera_aoi, set_bpp, set_cam_gain were duplicates of set_camera_area_of_interest, set_camera_bits_per_pixel, set_camera_gain.
The parameter is actually used in the scan_plate_range call.
DARK_BLOCK_INDEX, REFERENCE_BLOCK_INDEX, and UINT16_MAX make the calibration block indices and ADC resolution self-documenting.
Both send
Cancel response_task in finally block if an exception occurs before it completes.
TemperatureDevice, TemperatureState, and ChillerState were defined in both files. Keep the ones in spark_enums (PascalCase) and import them.
Log when all ratios are NaN or when avg_ratio is non-positive, instead of silently returning NaN.
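A sketch of the logging behavior described (function name and message text are hypothetical; the PR's actual processor code may differ):

```python
import logging
import math

logger = logging.getLogger(__name__)

def average_valid_ratio(ratios):
    valid = [r for r in ratios if not math.isnan(r)]
    if not valid:
        # log instead of silently returning NaN
        logger.warning("all ratios are NaN; returning NaN")
        return float("nan")
    avg_ratio = sum(valid) / len(valid)
    if avg_ratio <= 0:
        logger.warning("avg_ratio is non-positive (%s); returning NaN", avg_ratio)
        return float("nan")
    return avg_ratio
```

The caller still gets NaN in both failure modes, but the log line makes the cause visible when debugging a measurement.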
Helps debug malformed packets from USB corruption or firmware issues.
FluorescenceCarrier has duplicate enum values:

```python
class FluorescenceCarrier(Enum):
    MONOCHROMATOR_EXCITATION = "MONO"
    MONOCHROMATOR_EMISSION = "MONO"  # Same value as above
```

Both members map to the same string. If this is intentional because the hardware uses the same identifier for both, consider documenting why. Otherwise, one of these values may be incorrect.
merged the two enums into MONOCHROMATOR. I don't see the need to distinguish them
fixed with the correct command strings
The cancel() call only requests cancellation - the task continues running until the next await point. This change awaits the task after cancellation to ensure proper cleanup and prevent resource leaks.
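The cancel-then-await pattern in isolation (a generic asyncio sketch, not the PR's actual reader code):

```python
import asyncio

async def demo():
    cleaned_up = []

    async def worker():
        try:
            await asyncio.sleep(3600)
        finally:
            # cleanup only runs once the CancelledError is delivered
            cleaned_up.append(True)

    task = asyncio.create_task(worker())
    await asyncio.sleep(0)  # let the worker start
    task.cancel()           # only *requests* cancellation
    try:
        await task          # delivers CancelledError and waits for cleanup
    except asyncio.CancelledError:
        pass
    return cleaned_up
```

Without the `await task`, `demo()` could return before the worker's `finally` block has run, which is exactly the resource-leak window the commit closes.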
The previous implementation used an attempt counter (default 10000) which could spin for 100+ seconds with 10ms sleeps. Now uses time.monotonic() to enforce a wall-clock deadline (default 60 seconds), making the timeout behavior predictable and bounded.
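The deadline-based loop, sketched with hypothetical names:

```python
import time

def wait_for(predicate, timeout_s: float = 60.0, poll_interval_s: float = 0.01) -> bool:
    """Poll `predicate` until it returns True or a wall-clock deadline passes.

    Unlike an attempt counter, time.monotonic() bounds the total wait
    regardless of how long each individual poll takes.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(poll_interval_s)
    return False
```

`time.monotonic()` is the right clock here because it cannot jump backwards or forwards with system clock adjustments.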
Use _safe_div() for the (raw_ref_signal - ref_dark) denominator to return NaN instead of raising ZeroDivisionError when reference signal equals reference dark value.
MirrorCarrier, LaserPowerState, and LightingState were already defined in spark_enums.py. Import them instead of redefining.
…trol These enums are already defined in spark_enums.py. Import them instead.
- Add explicit None check for inner_mult_index in _parse_nested_mult call
- Import ScanDirection from spark_enums instead of re-exporting through measurement_control
is it a problem that

how does it look now?
Tecan Spark WIP. Luminescence coming up soon. I had a hard time getting io.usb to work with this, so I'm using libusb directly for now.