Releases: Vivswan/AnalogVNN
v1.0.8
v1.0.7
v1.0.6
What's Changed
- `Model` is now a subclass of `BackwardModule` for additional functionality.
- Using `inspect.isclass` to check if `backward_class` is a class in `Linear.set_backward_function`.
- `repr` using `self.__class__.__name__` in all classes.
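The `inspect.isclass` check and the `__repr__` pattern described above can be sketched as follows. The class and function names here are illustrative stand-ins, not AnalogVNN's actual definitions:

```python
import inspect

class BackwardModule:
    """Stand-in for a backward-module base class."""

    def __repr__(self):
        # Use self.__class__.__name__ so every subclass reports its own name.
        return f"{self.__class__.__name__}()"

class BackwardIdentity(BackwardModule):
    """Illustrative subclass."""

def set_backward_function(backward_class):
    # inspect.isclass distinguishes a class object from an instance,
    # so callers may pass either; a class is instantiated before use.
    if inspect.isclass(backward_class):
        return backward_class()
    return backward_class

print(repr(set_backward_function(BackwardIdentity)))    # BackwardIdentity()
print(repr(set_backward_function(BackwardIdentity())))  # BackwardIdentity()
```

Because `__repr__` reads the class name dynamically, every subclass gets a correct repr without overriding anything.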
Full Changelog: v1.0.5...v1.0.6
v1.0.5
What's Changed (Patches for PyTorch 2.0.1)
- Patch for PyTorch 2.0.1: added filtering of inputs in `BackwardGraph._calculate_gradients`.
- Removed unnecessary `PseudoParameter.grad` property.
Full Changelog: v1.0.4...v1.0.5
v1.0.4
What's Changed
- Combined `PseudoParameter` and `PseudoParameterModule` for better visibility.
- BugFix: fixed save and load of `state_dict` of `PseudoParameter` and its transformation module.
- Removed redundant class `analogvnn.parameter.Parameter`.
Full Changelog: v1.0.3...v1.0.4
v1.0.3
What's Changed
- Added support for no loss function in the `Model` class. If no loss function is provided, the `Model` object will use the outputs for gradient computation.
- Added support for multiple loss outputs from the loss function.
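A minimal sketch in plain PyTorch of what "using the outputs for gradient computation" can look like; this shows the general pattern of seeding `backward()` directly from the outputs, not AnalogVNN's internals:

```python
import torch

# Sketch: with no loss function, gradients can be driven directly from
# the outputs by supplying an explicit gradient seed to backward().
x = torch.tensor([1.0, 2.0], requires_grad=True)
outputs = x * 3.0

# Seed the backward pass with ones, i.e. d(outputs)/d(outputs) = 1.
outputs.backward(torch.ones_like(outputs))

print(x.grad)  # d(3x)/dx = 3 for each element
```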
Full Changelog: v1.0.2...v1.0.3
v1.0.2
What's Changed
- BugFix: removed `graph` from the `Layer` class; `graph` was causing issues with nested `Model` objects.
- `_use_autograd_graph` is now set directly while compiling the `Model` object.
Full Changelog: v1.0.1...v1.0.2
v1.0.1
What's Changed (Patches for PyTorch 2.0.0)
- Added a `grad.setter` to the `PseudoParameterModule` class.
Full Changelog: v1.0.0...v1.0.1
v1.0.0
Documentation: https://analogvnn.readthedocs.io/
AnalogVNN Paper: https://arxiv.org/abs/2210.10048
Cite:
Vivswan Shah and Nathan Youngblood. "AnalogVNN: A fully modular framework for modeling and optimizing photonic neural networks." *arXiv preprint arXiv:2210.10048* (2022).
Installation:
`pip install analogvnn`
Full Changelog: https://github.com/Vivswan/AnalogVNN/commits/v1.0.0