Implementation of Alphafold 3 in Pytorch
Implementation of LVSM, SOTA Large View Synthesis with Minimal 3d Inductive Bias, from Adobe Research
PyTorch Dual-Attention LSTM-Autoencoder For Multivariate Time Series
Some personal experiments around routing tokens to different autoregressive attention, akin to mixture-of-experts
Implementation of MambaFormer in Pytorch and Zeta, from the paper "Can Mamba Learn How to Learn? A Comparative Study on In-Context Learning Tasks"
Implementation of MagViT2 Tokenizer in Pytorch
Implementation of MeshGPT, SOTA Mesh generation using Attention, in Pytorch
Implementation of AudioLM, a SOTA Language Modeling Approach to Audio Generation out of Google Research, in Pytorch
Pytorch implementation of the sparse attention from the paper "Generating Long Sequences with Sparse Transformers" (a minimal sketch of the strided pattern appears below)
Implementation of "PaLM2-VAdapter:" from the multi-modal model paper: "PaLM2-VAdapter: Progressively Aligned Language Model Makes a Strong Vision-language Adapter"
Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling
Implementation of Q-Transformer, Scalable Offline Reinforcement Learning via Autoregressive Q-Functions, out of Google Deepmind
PyTorch Implementation of Jamba: "Jamba: A Hybrid Transformer-Mamba Language Model"
Implementation of RT1 (Robotic Transformer) in Pytorch
Zeta implementation of "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers"
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in Pytorch
Implementation of CALM from the paper "LLM Augmented LLMs: Expanding Capabilities through Composition", out of Google Deepmind
An implementation of local windowed attention for language modeling (a minimal sketch of windowed attention appears below)
Implementation of the conditionally routed attention in the CoLT5 architecture, in Pytorch
Enhancing Conditional Image Generation with Explainable Latent Space Manipulation
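For the sparse attention entry above, here is a minimal sketch of the strided sparsity pattern described in "Generating Long Sequences with Sparse Transformers". The function name and the way the local and strided masks are combined are illustrative assumptions, not code taken from the listed repository.

```python
import torch

def strided_sparse_mask(seq_len: int, stride: int) -> torch.Tensor:
    """Boolean attention mask for a strided sparse pattern (illustrative sketch).

    Position i may attend to position j when j is in the causal past and either
    lies within the previous `stride` positions (local pattern) or sits at an
    offset from i that is a multiple of `stride` (strided pattern).
    """
    i = torch.arange(seq_len).unsqueeze(1)  # query positions, column vector
    j = torch.arange(seq_len).unsqueeze(0)  # key positions, row vector
    causal = j <= i
    local = (i - j) < stride
    strided = (i - j) % stride == 0
    return causal & (local | strided)

# usage: mask out disallowed positions before the softmax over attention scores
mask = strided_sparse_mask(seq_len=16, stride=4)
scores = torch.randn(16, 16).masked_fill(~mask, float('-inf'))
attn = scores.softmax(dim=-1)
```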
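For the local windowed attention entry, the following is a minimal sketch of attention restricted to non-overlapping windows, assuming the sequence length divides evenly by the window size. It illustrates the general idea only and is not drawn from the listed implementation.

```python
import torch

def local_windowed_attention(q, k, v, window_size):
    """Attention restricted to non-overlapping windows (illustrative sketch).

    q, k, v: tensors of shape (batch, seq_len, dim); seq_len is assumed to be
    divisible by window_size. Each token attends only to tokens in its window.
    """
    b, n, d = q.shape
    w = window_size
    # fold the sequence into (batch, num_windows, window_size, dim)
    q, k, v = (t.reshape(b, n // w, w, d) for t in (q, k, v))
    # scaled dot-product attention computed independently within each window
    scores = torch.einsum('bnid,bnjd->bnij', q, k) / d ** 0.5
    attn = scores.softmax(dim=-1)
    out = torch.einsum('bnij,bnjd->bnid', attn, v)
    return out.reshape(b, n, d)

# usage: batch of 2 sequences, length 16, dim 32, windows of 4 tokens
q = k = v = torch.randn(2, 16, 32)
print(local_windowed_attention(q, k, v, window_size=4).shape)  # (2, 16, 32)
```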