TorchRecurrent is a PyTorch-compatible collection of recurrent neural network cells and layers from across the research literature. It aims to provide a unified, flexible interface that feels like native PyTorch while exposing more customization options.
```bash
pip install torchrecurrent
```
Coming soon to conda-forge as well!
- 🔄 **30+ recurrent cells** (e.g. `LSTMCell`, `GRUCell`, and many specialized variants) — see the cell-level sketch below.
- 🏗️ **30+ recurrent layers** (e.g. `LSTM`, `GRU`, and counterparts for each cell).
- 🧩 **Unified API** — all cells and layers follow the PyTorch interface but add extra options for initialization and customization.
- 📚 **Comprehensive documentation**, including an API reference and a catalog of published models.
👉 Full model catalog: torchrecurrent Models
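Cells process a single time step, so you drive the recurrence yourself. Below is a minimal sketch of cell-level usage; it assumes `LSTMCell` mirrors the `torch.nn.LSTMCell` calling convention (per-step input of shape `(batch, input_size)`, hidden state as an `(h, c)` tuple), which follows from the unified-API claim above but is not verified here — consult the API reference for exact signatures.

```python
import torch
from torchrecurrent import LSTMCell

# Assumed constructor, matching torch.nn.LSTMCell
cell = LSTMCell(input_size=10, hidden_size=20)

inp = torch.randn(5, 3, 10)   # (time_steps, batch, input_size)
h = torch.zeros(3, 20)        # initial hidden state, (batch, hidden_size)
c = torch.zeros(3, 20)        # initial cell state, (batch, hidden_size)

outputs = []
for t in range(inp.size(0)):  # step through the sequence manually
    h, c = cell(inp[t], (h, c))
    outputs.append(h)

out = torch.stack(outputs)    # (time_steps, batch, hidden_size)
```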
```python
import torch
from torchrecurrent import MGU  # Minimal Gated Unit

# sequence: (time_steps, batch, input_size)
inp = torch.randn(5, 3, 10)

# initialize an MGU with hidden_size=20
rnn = MGU(input_size=10, hidden_size=20, num_layers=3)

# forward pass
out, hidden = rnn(inp)
print(out.shape)  # (time_steps, batch, hidden_size)
```
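Since layers return the final hidden state alongside the outputs, a long stream can presumably be processed in chunks by feeding that state back in, as with `torch.nn.GRU`. A minimal sketch, assuming the forward pass accepts an optional initial hidden state (an assumption based on the PyTorch-interface claim, not a documented guarantee):

```python
import torch
from torchrecurrent import MGU

rnn = MGU(input_size=10, hidden_size=20, num_layers=3)

# two consecutive chunks of one stream: (time_steps, batch, input_size)
chunk1 = torch.randn(5, 3, 10)
chunk2 = torch.randn(5, 3, 10)

out1, hidden = rnn(chunk1)           # hidden holds the state after chunk1
out2, hidden = rnn(chunk2, hidden)   # assumed: resume from that state
```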
- LuxRecurrentLayers.jl: recurrent layers for Lux.jl in Julia.
- RecurrentLayers.jl: recurrent layers for Flux.jl in Julia.
- ReservoirComputing.jl: reservoir computing utilities for scientific machine learning — essentially recurrent neural networks trained without gradient-based optimization.
This project’s own code is distributed under the MIT License (see LICENSE). The primary intent of this software is academic research.
Some cells are re-implementations of published methods that carry their own licenses:
- NASCell: originally available under Apache 2.0 — see LICENSE-Apache2.0.txt.
Please consult each of those licenses for your obligations when using this code in commercial or closed-source settings.
⚠️ Disclaimer: TorchRecurrent is an independent project and is not affiliated with the PyTorch project or Meta AI. The name reflects compatibility with PyTorch, not any official endorsement.