By adhering to the Don't Repeat Yourself (DRY) principle, this library makes your machine-learning projects easier to replicate, document, and reuse.
- Experimental Scope: All logic runs within a controlled scope, preventing unintended dependencies, data leakage, and misconfiguration.
- Modularity: Components communicate via defined protocols, providing type safety and flexibility for custom implementations.
- Decoupled Tracking: Logging, plotting, and metadata are handled by an event system that separates execution from tracking.
- Lean Dependencies: Minimal core requirements while supporting optional external libraries (Hydra, W&B, TensorBoard, etc.).
- Self-Documentation: Metadata is automatically extracted in a standardized and robust manner.
- Ready-to-Use Implementations: Advanced functionalities with minimal boilerplate, suitable for a wide range of ML applications.
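To illustrate two of the design points above, here is a minimal, self-contained sketch of protocol-based modularity and event-driven tracking. This is not the library's actual API; all names (`EventBus`, `LossLogger`, `Metric`) are hypothetical and chosen only to show the pattern of a tracker that subscribes to events instead of being called from the training loop.

```python
from typing import Protocol


class Metric(Protocol):
    """Hypothetical protocol: any object with this method is a valid metric."""

    def update(self, value: float) -> None: ...


class EventBus:
    """Tiny event system: publishers never know who is listening."""

    def __init__(self) -> None:
        self._subscribers = []

    def subscribe(self, handler) -> None:
        self._subscribers.append(handler)

    def publish(self, event: dict) -> None:
        for handler in self._subscribers:
            handler(event)


class LossLogger:
    """Tracker that only reacts to events; it never touches the training loop."""

    def __init__(self) -> None:
        self.history: list[float] = []

    def __call__(self, event: dict) -> None:
        if event.get("name") == "loss":
            self.history.append(event["value"])


bus = EventBus()
logger = LossLogger()
bus.subscribe(logger)

# Execution code publishes events; logging stays fully decoupled.
for step, loss in enumerate([0.9, 0.5, 0.3]):
    bus.publish({"name": "loss", "step": step, "value": loss})

print(logger.history)  # [0.9, 0.5, 0.3]
```

Because tracking is attached by subscription, swapping in a different tracker (e.g. one that writes to TensorBoard) requires no change to the code that publishes events.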
## Requirements

The library requires only recent versions of PyTorch and NumPy. PyYAML and tqdm are recommended for better tracking.
Install with `uv add drytorch`.

Folders are organized as follows:
- Core (`core`): The library kernel. Contains internal routines and the interfaces for defining custom components.
- Standard Library (`lib`): Reusable implementations and abstract classes of the protocols.
- Trackers (`tracker`): Optional tracker plugins that integrate via the event system.
- Contributions (`contrib`): A dedicated space for community-driven extensions.
- Utilities (`utils`): Functions and classes independent of the framework.
Read the full documentation on Read the Docs →
The documentation includes:
- Tutorials - Complete walkthrough
- API Reference - Detailed API documentation
- Architecture Overview - Design principles and structure
