- improved the downsampling of output variables in the Network.run method: downsampled values are now calculated as an average over all simulated samples
- updated documentation and use examples to account for the recent changes to the Network.add_diffeq_node and Network.run methods
- updated pytests to account for new downsampling method
- added a cutoff keyword argument to the Network.run method that defines the number of initial simulation steps to disregard in the results
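  A minimal, hedged sketch of the new behavior (the model template path, variable names, and input shape are illustrative; only the cutoff argument and the averaging-based downsampling are taken from this changelog):

  ```python
  import numpy as np
  from rectipy import Network

  net = Network(dt=1e-3)
  net.add_diffeq_node("li", node="neuron_model_templates.rate_neurons.leaky_integrator.tanh",
                      input_var="li_op/I_ext", output_var="tanh_op/tanh")

  inp = np.random.randn(11000, 1)       # extrinsic input: (simulation steps, input dimensions)
  obs = net.run(inp,
                sampling_steps=10,       # recorded outputs are averaged over every 10 simulated samples
                cutoff=1000)             # the first 1000 steps are discarded from the results
  ```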
- updated PyPI config for uploading new versions
- added option to define spiking networks which contain the spike reset mechanism in their governing differential equations
- changed the name of the keyword argument spike_def of the method Network.add_diffeq_node to reset_var
- added a keyword argument reset to Network.add_diffeq_node that controls whether RectiPy's spike reset mechanism should be used for a given node or whether the node has an intrinsic spike reset mechanism
- added a QIF neuron template that uses an intrinsic spike reset mechanism
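  A hedged sketch of adding a node whose governing equations include the spike reset (the template path, operator/variable names, and weights are illustrative; reset_var and reset are the arguments introduced above):

  ```python
  import numpy as np
  from rectipy import Network

  J = np.random.randn(100, 100) * 0.1   # illustrative recurrent coupling weights

  net = Network(dt=1e-4)
  net.add_diffeq_node(
      "qif", node="neuron_model_templates.spiking_neurons.qif.qif_intrinsic_reset",  # illustrative path
      weights=J, source_var="s", target_var="s_in",
      input_var="I_ext", output_var="s",
      spike_var="spike",
      reset_var="v",   # keyword formerly called spike_def
      reset=False,     # the node's own differential equations implement the spike reset
  )
  ```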
- updated the test library to account for these changes
- dropped official support for python 3.6 and added support for python 3.10
- updated readthedocs configuration file
- added new node classes MemoryNode and MemoryNet that allow delay coupling to be implemented in networks
- renamed the FunctionNode class to InstantNode to better set it apart from other node types that have some form of intrinsic memory
- improved the method for detaching tensors via Network.detach(), which now provides more control over which tensors to detach
- debugged the FeedbackNetwork.compile() method so that it works with multiple calls
- introduced edge masking as a new feature; it can be used by passing a mask array to the keyword argument mask of Network.add_edge
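  A sketch of edge masking (node labels and shapes are illustrative; net is assumed to be an existing network instance with nodes "inp" and "rnn"):

  ```python
  import numpy as np

  n_in, n_rnn = 10, 100
  W = np.random.randn(n_rnn, n_in)
  mask = (np.random.rand(n_rnn, n_in) > 0.8)   # keep roughly 20% of the connections

  net.add_edge("inp", "rnn", weights=W, mask=mask)
  ```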
- added a class FeedbackNetwork that allows feedback connections to be added between nodes in a network. Intended use: first, build the feedforward network structure that maps from input to output; then add feedback edges, marking them via the keyword argument feedback=True of the FeedbackNetwork.add_edge method.
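  A sketch of this workflow (node labels, node construction, and weight shapes are illustrative; FeedbackNetwork.add_edge with feedback=True is the feature described above, and the class is assumed to be exposed at the package root):

  ```python
  import numpy as np
  from rectipy import FeedbackNetwork

  net = FeedbackNetwork(dt=1e-3)
  # ... add the nodes "inp", "rnn1", "rnn2" and "out" here ...

  # feedforward structure from input to output
  net.add_edge("inp", "rnn1", weights=np.random.randn(100, 10))
  net.add_edge("rnn1", "rnn2", weights=np.random.randn(100, 100))
  net.add_edge("rnn2", "out", weights=np.random.randn(2, 100))

  # feedback connection, marked explicitly
  net.add_edge("rnn2", "rnn1", weights=np.random.randn(100, 100), feedback=True)
  ```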
- both feedforward and feedback edges can be trained
- Re-worked the backpropagation through time and recursive least squares training methods to work with feedback weights
- updated the readthedocs documentation
- added new pytests for the different optimization methods: Ridge regression, backpropagation, and recursive least squares
- added a class MultiSpikeNode that allows multiple spiking variables to be defined within a single differential-equation-based node
- updated documentation of user interfaces
- updated use examples for user interfaces
- removed a minor bug in nodes.py that led to issues with retrieving variable indices from single-neuron nodes
- removed a bug in nodes.py where, in some cases, the state vector y was not moved to the correct device after state resets
- debugged the Network.reset() method; it now properly resets the network to the passed state dictionary, if one is provided
- improved the RateNet.reset() method; it can now handle both numpy arrays and tensors as input
- added a new method Network.clear() that removes all nodes and edges from a rectipy.Network instance
- added new keyword arguments to Network.detach() that allow customizing which tensors are detached and whether they should require a gradient after detachment
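  A short, hedged sketch of a typical use inside a training loop (the requires_grad keyword shown is an assumption based on this entry):

  ```python
  # after an optimizer step: cut all state variables loose from the current autograd
  # graph so that the next forward pass builds a fresh graph
  net.detach(requires_grad=True)
  ```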
- added a new method Network.set_var() that allows changing the values of variables in the network instance
- added a new method Node.set_param() that allows changing the values of node parameters
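  Hedged usage sketches for the two setters (node label, variable/parameter names, argument order, and the get_node accessor are assumptions based on these entries):

  ```python
  import torch

  # overwrite a state variable of a node in an existing rectipy.Network
  net.set_var("rnn", "v", torch.zeros(100))

  # overwrite a parameter directly on a node instance
  node = net.get_node("rnn")
  node.set_param("tau", 10.0)
  ```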
- improved documentation of the methods of rectipy.Network
- Changed user interface: rectipy.Network is now initialized as an empty graph. Nodes and edges can be added to it afterwards.
- Added new modules: rectipy.nodes and rectipy.edges provide different classes for nodes and edges that can be added to rectipy.Network instances
- Altered training functionalities: rectipy.Network now features two methods for parameter optimization: Network.fit_ridge for Ridge regression-based training of a set of readout weights, and Network.fit_bptt for gradient descent based on backpropagation through time.
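  A condensed sketch of the reworked workflow, from an empty graph to readout training (template path, variable names, shapes and most keyword arguments are illustrative; the method names are the ones introduced above):

  ```python
  import numpy as np
  from rectipy import Network

  # 1. start from an empty graph and add nodes and edges
  net = Network(dt=1e-3)
  net.add_diffeq_node("rnn", node="neuron_model_templates.rate_neurons.leaky_integrator.tanh",
                      weights=np.random.randn(100, 100) * 0.1,
                      source_var="tanh_op/tanh", target_var="li_op/r_in",
                      input_var="li_op/I_ext", output_var="tanh_op/tanh")

  # 2. train a set of readout weights via ridge regression ...
  inp = np.random.randn(10000, 100)
  targets = np.random.randn(10000, 2)
  obs = net.fit_ridge(inp, targets)

  # ... or train network parameters via (truncated) backpropagation through time
  # obs = net.fit_bptt(inp, targets)
  ```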
- Implemented truncated backpropagation through time
- Implemented gradient surrogates for spiking neural networks
- New functionalities of the rectipy.Observer module: All recorded outputs can now be returned either as a list of torch.Tensor objects, as numpy arrays, or as pandas.DataFrame objects.
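  A sketch of retrieving recordings in the different formats (the recording key is illustrative; the exact indexing behavior and the method names to_numpy and to_dataframe are assumptions based on this entry):

  ```python
  obs = net.run(inp, sampling_steps=10)   # returns a rectipy Observer

  out_tensors = obs["out"]                # recordings as a list of torch.Tensor objects
  out_array = obs.to_numpy("out")         # the same recordings as a numpy array
  out_frame = obs.to_dataframe("out")     # the same recordings as a pandas.DataFrame
  ```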
- Added convenience functions on rectipy.Network for adding and getting nodes, edges and network variables.
- Improved integration of rectipy.Network with pytorch parameter optimization methods by adding high-level methods for (i) detaching all state-variables from the current graph for gradient computation, and (ii) resetting the state of the entire network.
- Updated unit tests and documentation to work with the changes described above
- Updated PyRates interface to work with recent changes to the pyrates.CircuitTemplate.add_edges_from_matrix method
- added a new IK neuron template with biexponential synaptic dynamics
- improved layout of the readthedocs documentation website
- changed the readout function to use the SGDClassifier instead of Ridge from sklearn
- cleaned some code after model deployment changes made in 0.9.1
- debugged the from_template initialization methods
- minor bug fix of faulty normalization of input weights in utility.input_connections
- resolved issues with deploying the model on a specific device: instead of providing the device ("cpu" or "cuda") to the Network.compile method, it is now provided during initialization
- debugged network initialization method Network.from_template
- debugged global recovery variable definition of izhikevich model template
- debugged simulation test
- added a new rectipy.Network initialization method: Network.from_template, which allows Network instances to be initialized from pyrates.CircuitTemplate instances. This way, the user has full control over the construction of the network template.
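  A hedged sketch of this route (the template path and the keyword arguments passed to from_template are illustrative; the CircuitTemplate construction follows standard PyRates usage):

  ```python
  from pyrates import CircuitTemplate, NodeTemplate
  from rectipy import Network

  # build the network template with full control via PyRates
  node = NodeTemplate.from_yaml("model_templates.base_templates.tanh_node")  # illustrative path
  template = CircuitTemplate("net", nodes={"p1": node, "p2": node},
                             edges=[("p1/tanh_op/tanh", "p2/li_op/r_in", None, {"weight": 1.0})])

  # wrap the template in a rectipy Network
  net = Network.from_template(template, input_var="li_op/I_ext", output_var="tanh_op/tanh", dt=1e-3)
  ```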
- added a use example for rectipy-torch integration
- added a function for matrix normalization to utility
- added the izhikevich neuron model as a template
- added an izhikevich neuron with global recovery variable as a template
- added the visualization method rectipy.observer.Observer.matshow, which creates 2D color-coded plots of multi-dimensional RNN state variables
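  A one-line usage sketch (the recording key is illustrative):

  ```python
  obs.matshow("v")   # obs is a rectipy.observer.Observer with a recorded, multi-dimensional variable "v"
  ```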
- simplified alteration of default parameter values during network initialization
- added use example for training and testing via the Network.train and Network.test methods
- added a global coupling constant k to the qif model template
- improved docstrings
- added use example for the LIF neuron model
- new variable views available on the rectipy.Network and rectipy.rnn_layer.RNNLayer classes: Network.__getitem__() and RNNLayer.__getitem__() allow direct access to parameters and variables of the RNNLayer instance
- integrated the new variable views into the documentation and testing suite
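  A short sketch of these views (the key format and the rnn_layer attribute name are assumptions; the key may require the full operator/variable path):

  ```python
  v = net["li_op/v"]                 # read access via Network.__getitem__()
  v_rnn = net.rnn_layer["li_op/v"]   # equivalent access via RNNLayer.__getitem__()
  ```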
- simplified code for model definitions based on the new variable views
- added use example for the QIF neuron models
- added use example for the leaky-integrator rate neuron model
- added use example gallery skeleton
- added use example for network initialization
- added use example for numerical simulations
- added use example for the observer
- removed bug from SRNNLayer that caused model initialization to fail when no dtype for variables was provided
- removed bug from the sigmoid operator that is part of the leaky_integrator.yaml model definition file
- added .gitignore file
- added model template for LIF neurons
- improved docstrings of the Network class
- added documentation source files for a readthedocs documentation website
- added yaml configuration and config files for readthedocs installation
- added a first use example
- added installation instructions
- added the changelog to the readthedocs website sources
- added a full API section
- renamed the tests module to rectipy_tests to avoid confusion with the PyRates.tests module
- reduced overhead of InputLayer and OutputLayer by making them return instances of torch.nn.Linear or rectipy.input_layer.LinearStatic upon initialization
- reduced overhead of Network.compile by directly accessing the torch.Module instances to create the torch.Sequential
- improved test library with more extensive testing of RNNLayer and Network functionalities
- added new pytests that test the functionalities of the RNNLayer.record and RNNLayer.reset methods
- added new pytests that test the initialization functions of Network
- improved integration of PyRates into RectiPy, by making sure that all PyRates caches are cleared, even if building the network functions fails due to erroneous user inputs
- removed all in-place operations for non-spiking networks
- changed pyrates interface such that vector-field updates are not performed in-place anymore
- only in-place operation left: Spike resetting
- added methods Network.forward and Network.parameters that allow the Network class to be embedded in larger network structures
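  A hedged sketch of such an embedding (shapes are illustrative; net is assumed to be an already compiled Network with 100 output units):

  ```python
  import torch

  # the network can be chained with other torch modules and its parameters can be
  # handed to a torch optimizer together with theirs
  readout = torch.nn.Linear(100, 2)
  optimizer = torch.optim.Adam(list(net.parameters()) + list(readout.parameters()), lr=1e-3)

  x = torch.randn(100)
  y = readout(net.forward(x))
  ```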
- added the method RNNLayer.reset, which can be used to reset the state vector of the RNN
- added new tests for the rnn layer
- debugged the detach method in the rnn layer
- debugged issues with in-place operations and autograd
- added a new example for parameter fitting within the RNN layer
- improved documentation
- added pytests for the initialization functions of the rnn layer
- debugged index-finding functions for trainable parameters in the rnn layer
- improved integration of pyrates functions into rnn layer
- added a utility function readout that trains a readout classifier on collected network states and targets
- added new gradient descent optimizer options
- added the possibility of making an optimizer step only every x training steps (gradients will accumulate over these steps)
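  A minimal, hedged sketch of the readout utility listed above (the import path and call signature are assumptions based on this entry):

  ```python
  import numpy as np
  from rectipy.utility import readout   # import path is an assumption

  # illustrative collected network states (samples x units) and class labels
  states = np.random.randn(1000, 100)
  targets = np.random.randint(0, 2, 1000)

  result = readout(states, targets)
  ```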
- renamed the model template package to avoid interference with the pyrates-intrinsic model template package
- added a utility function for the generation of input weight matrices
- added a utility function for winner-takes-all score calculation
- added getitem methods to the Network (integer-based indexing, returns layers) and Observer (string-based indexing, returns recordings) classes
- added the possibility to train in epochs via the Network.train method
- made the device argument of Network.compile optional
- ensured that the activation functions of the OutputLayer are always applied to the first dimension of the outputs
- ensured that state variable indices in the RNN layer use the correct data type (torch.int64)
- added pytests for the output layer
- added checks on the correctness of the input arguments for the output layer
- added keyword arguments to OutputLayer.__init__() that are passed on to torch.nn.Linear if trainable=True
- added pytests for the input layer
- added a CircleCI config
- added automated execution of all tests via CircleCI upon pushing to github
- added pytest to the requirements
- added docstrings to the Network class for all non-private methods
- added docstrings to the Observer class for all non-private methods
- made Network.compile a public method and reduced the number of automatic calls to it by Network (Network.train, Network.test and Network.run only call Network.compile themselves if it has not been called before)
- added a public property Network.model that provides read access to the pytorch model of the network
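  A short sketch (grounded in the two entries above; the torch usage at the end is illustrative):

  ```python
  net.compile()        # build the underlying torch model explicitly; Network.train/test/run
                       # will call this themselves if it has not happened yet
  model = net.model    # read access to the compiled pytorch model
  n_params = sum(p.numel() for p in model.parameters())
  ```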
- added automated pypi releases
- added github workflow for pypi releases
- updated readme
- code structure:
  - network class as main user interface
  - input, output, and rnn layers as network components
  - observer as class for results storage
  - model templates package for yaml definition files
- installation instructions