Add differentiable Gray-coded QAM modulation and demodulation #41
Conversation
…es with unit tests
Add new image compressors: JPEG2000, JPEG XL, and WebP
- Implemented JPEG2000Compressor with configurable quality and lossless options.
- Implemented JPEGXLCompressor with support for lossless compression and adjustable effort.
- Implemented WebPCompressor with options for quality, lossless mode, and compression method.
- Updated __init__.py to include the new compressors in the module.
- Added unit tests for the JPEG2000, JPEG XL, and WebP compressors to ensure functionality.
- Updated the test script to include tests for the new compressors.
- Updated requirements to include the dependencies the new compressors need.

- Introduced SampleDataLoader class to unify loading of dataset and standard test images.
- Removed the old load_sample_images function and integrated its functionality into the new class.
- Updated example scripts and tests to use the new SampleDataLoader interface.
- Added methods for downloading and caching standard test images.
- Enhanced error handling and added support for resizing images during loading.
- Removed the deprecated download_example_images.py script.

- Added ModelConfig class for unified model configuration management.
- Updated BaseModel to support configuration objects (PretrainedConfig, DictConfig, dict).
- Introduced KairaTrainer for unified training of communication models with flexible argument handling.
- Implemented TrainingArguments class to support Hydra and dict-based configurations.
- Added tests for new dataset loading functionality and improved error handling in image downloading.
- Updated requirements to include the libraries the new features need.
…asses; enhance documentation and error handling in sample data tests
- Introduced test_init.py to validate imports and exports in the kaira.data module.
- Enhanced test_sample_data.py with tests for the download_image function.
- Created test_training.py with extensive coverage of the Trainer and TrainingArguments classes, including various initialization scenarios and configuration handling.
- Added tests for extracting configuration values and for proper parameter filtering in TrainingArgumentsMixin.
- Verified integration between the components of the training module.
…e training configurations for models
…nce dataset filtering
- Updated `kaira_train.py` to streamline model loading and configuration parsing.
- Removed unnecessary parameters from the `load_model_from_config` function.
- Simplified configuration loading logic using Hydra and OmegaConf.
- Enhanced `create_training_arguments_from_args` to support direct CLI argument parsing.
- Removed extensive test cases for training arguments and the trainer due to redundancy.
- Ensured all tests align with the new configuration handling approach.

(Same changes, cherry-picked from 8f2293c with conflict resolution.)
Pull Request Overview
This PR adds differentiable Gray-coded QAM modulation and demodulation capabilities to enable end-to-end learnable communication channels. Key changes include:
- Introducing a differentiable parameter in the QAM modulator with corresponding behavior changes.
- Adding an approximate demodulation mode with a new LLR calculation branch.
- Refactoring squared distance computations for LLR calculations in the demodulator.
-    # Find minimum distance across all points
-    min_distances, _ = torch.min(squared_distances, dim=-1)
-    return min_distances
+    return squared_distances
In the _min_squared_distance function, the torch.min reduction is removed, causing the function to return the full squared distances instead of the minimum. This change breaks the expected behavior for LLR calculations; please ensure that _min_squared_distance returns the minimum squared distances as originally intended.
…ization across examples
Description
This pull request introduces fundamental building blocks for learned communication systems by adding differentiable Quadrature Amplitude Modulation (QAM) and demodulation capabilities to the library.
Currently, our toolkit lacks differentiable physical layer components, which prevents the end-to-end training of neural networks that include a communication channel. These changes enable researchers and engineers to build and train models like neural autoencoders for the physical layer.
The two commits in this PR achieve the following:
Differentiable Gray-coded QAM Modulation (45364d5): A new function is added to map input bits to complex-valued QAM symbols. It uses Gray coding to ensure that adjacent symbols in the constellation differ by only a single bit, which is crucial for minimizing the bit error rate (BER) during demodulation.
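A minimal PyTorch sketch of such a Gray-coded mapping is shown below. It assumes a square constellation with MSB-first bit labels (I bits in the high half) and unit average energy; `gray_qam_constellation` and `qam_modulate` are hypothetical names used for illustration, not this library's API.

```python
import torch

def gray_qam_constellation(order: int) -> torch.Tensor:
    # Square-QAM constellation indexed by the integer bit label (MSB first),
    # Gray-coded per axis so adjacent amplitudes differ in exactly one bit.
    k = order.bit_length() - 1              # bits per symbol; order must be 2**k
    m = 1 << (k // 2)                       # PAM levels per axis
    levels = torch.arange(m, dtype=torch.float32) * 2 - (m - 1)  # e.g. [-3,-1,1,3]
    idx = torch.arange(m)
    gray = idx ^ (idx >> 1)                 # Gray code of each level index
    axis = torch.empty(m)
    axis[gray] = levels                     # axis[label] = amplitude for that Gray label
    i_part = axis.repeat_interleave(m)      # high bits select the I amplitude
    q_part = axis.repeat(m)                 # low bits select the Q amplitude
    const = torch.complex(i_part, q_part)
    return const / const.abs().pow(2).mean().sqrt()  # unit average energy

def qam_modulate(bits: torch.Tensor, const: torch.Tensor) -> torch.Tensor:
    # bits: (..., k) tensor of 0/1; pack MSB-first into labels and look up symbols.
    k = bits.shape[-1]
    weights = 2 ** torch.arange(k - 1, -1, -1)
    labels = (bits.long() * weights).sum(dim=-1)
    return const[labels]
```

Because the symbols come from a tensor lookup, gradients flow from the channel output back through the complex symbol values; the discrete bit-to-label packing itself has no useful gradient, which is why the smooth path matters mainly on the demodulator side.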
Differentiable Exact & Approximate QAM Demodulation (3b5ee5e): Two demodulation methods are introduced:
An exact demodulator that computes distances to all constellation points, assuming equal a priori probability for all symbols. This provides a smooth, fully differentiable function suitable for backpropagation during training.
An approximate demodulator that uses the distance to the nearest neighbor in the constellation. While its gradient may be less smooth (due to the min operation), it is useful for fast demodulation of higher-order constellations.
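The two demodulation modes above can be sketched as bitwise log-likelihood ratios (LLRs) over a constellation, with the exact mode using a log-sum-exp over all points and the approximate mode using only the nearest point per bit hypothesis (the max-log approximation). This is an illustrative sketch, not the PR's actual implementation; `qam_llrs` is a hypothetical name, and the constellation is assumed indexed by its MSB-first bit label.

```python
import torch

def qam_llrs(y: torch.Tensor, const: torch.Tensor, noise_var: float,
             mode: str = "exact") -> torch.Tensor:
    """Bitwise LLRs (log p(bit=0) - log p(bit=1)) for received symbols y."""
    k = const.numel().bit_length() - 1                 # bits per symbol
    d2 = (y.unsqueeze(-1) - const).abs().pow(2)        # (..., M) squared distances
    metric = -d2 / noise_var                           # Gaussian log-likelihood (up to a constant)
    labels = torch.arange(const.numel())
    llrs = []
    for b in range(k):
        bit = ((labels >> (k - 1 - b)) & 1).bool()     # value of bit b in each label
        m0 = metric.masked_fill(bit, float("-inf"))    # keep only bit=0 hypotheses
        m1 = metric.masked_fill(~bit, float("-inf"))   # keep only bit=1 hypotheses
        if mode == "exact":
            # Smooth, fully differentiable sum over all points per hypothesis.
            llrs.append(torch.logsumexp(m0, -1) - torch.logsumexp(m1, -1))
        else:
            # Max-log approximation: nearest point per bit hypothesis.
            llrs.append(m0.max(-1).values - m1.max(-1).values)
    return torch.stack(llrs, dim=-1)
```

The exact branch stays smooth everywhere, while the max-log branch trades that smoothness for one `max` per bit, which scales better to higher-order constellations.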