list of efficient attention modules
Flexible Python library providing building blocks (layers) for reproducible Transformers research (Tensorflow ✅, Pytorch 🔜, and Jax 🔜)
Implementation of "Attention is All You Need" paper
A chatbot built with TensorFlow, using a Transformer model.
Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.
Synthesizer self-attention is a recent alternative to causal self-attention that replaces the query-key dot product with synthesized attention weights, with potential efficiency benefits.
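The idea above can be sketched in a few lines. This is a minimal NumPy sketch of the "Dense" Synthesizer variant (Tay et al., 2020), where attention logits come from a per-token two-layer MLP instead of a query-key dot product; it omits the causal mask and multi-head split, and all weight names are illustrative, not taken from any repository here.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dense_synthesizer(x, W1, b1, W2, b2, Wv):
    """Dense Synthesizer head, simplified sketch.

    Attention logits are synthesized from each token alone via a
    two-layer MLP -- no query-key dot product is ever computed.
    x:  (seq_len, d_model) input
    W1: (d_model, d_hidden), W2: (d_hidden, seq_len) -> per-token logits
    Wv: (d_model, d_model) value projection
    """
    logits = np.maximum(x @ W1 + b1, 0.0) @ W2 + b2   # (seq_len, seq_len)
    attn = softmax(logits, axis=-1)                   # rows sum to 1
    return attn @ (x @ Wv)                            # weighted values

# toy check: output keeps the input's (seq_len, d_model) shape
rng = np.random.default_rng(0)
seq, d, h = 4, 8, 16
x = rng.normal(size=(seq, d))
out = dense_synthesizer(x,
                        rng.normal(size=(d, h)), np.zeros(h),
                        rng.normal(size=(h, seq)), np.zeros(seq),
                        rng.normal(size=(d, d)))
print(out.shape)  # (4, 8)
```

Note that because `W2` maps to a fixed `seq_len`, the plain Dense variant ties the model to a maximum sequence length, one of the trade-offs of removing the dot product.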
An experimental project for autonomous vehicle driving perception with steering angle prediction and semantic segmentation using a combination of UNet, attention and transformers.
The implementation of transformer as presented in the paper "Attention is all you need" from scratch.
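A from-scratch implementation of the paper's core operation usually reduces to the two functions below: scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, and a multi-head wrapper that splits the model dimension into subspaces. This is a minimal NumPy sketch (no masking, dropout, or batching); the helper names are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)
    return softmax(scores, axis=-1) @ V

def multi_head_attention(x, Wq, Wk, Wv, Wo, num_heads):
    """Split d_model into num_heads subspaces, attend in each, concat, project."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    def project(W):  # (seq, d_model) -> (heads, seq, d_head)
        return (x @ W).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    heads = scaled_dot_product_attention(project(Wq), project(Wk), project(Wv))
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)  # re-join heads
    return concat @ Wo

# toy check: self-attention preserves the (seq_len, d_model) shape
rng = np.random.default_rng(1)
seq, d_model, H = 5, 16, 4
W = lambda: rng.normal(size=(d_model, d_model))
y = multi_head_attention(rng.normal(size=(seq, d_model)), W(), W(), W(), W(), H)
print(y.shape)  # (5, 16)
```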
This package is a TensorFlow 2/Keras implementation of Graph Attention Network embeddings and also provides a trainable layer for multi-head graph attention.
A PyTorch Implementation of PGL-SUM from "Combining Global and Local Attention with Positional Encoding for Video Summarization", Proc. IEEE ISM 2021
A repository for implementations of attention mechanism by PyTorch.
A code deep-dive on one of the key innovations from Deepseek - Multihead Latent Attention (MLA)
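The core of MLA can be conveyed with a short sketch: instead of caching full per-head keys and values, each token is compressed into a small shared latent vector from which K and V are reconstructed, shrinking the KV cache. The NumPy sketch below shows only that low-rank idea for a single head; it deliberately omits DeepSeek's decoupled RoPE keys, the multi-head split, and query compression, and all weight names are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mla_attention(x, Wq, W_down, W_uk, W_uv):
    """Single-head sketch of the latent-compression idea in MLA.

    Keys and values are reconstructed from a shared low-rank latent
    c = x @ W_down, so only c (d_latent floats per token) needs to be
    cached at inference time instead of full K and V.
    """
    c = x @ W_down                  # (seq, d_latent)  <- the whole KV cache
    K = c @ W_uk                    # up-project latent to keys
    V = c @ W_uv                    # up-project latent to values
    Q = x @ Wq
    attn = softmax(Q @ K.T / np.sqrt(Q.shape[-1]), axis=-1)
    return attn @ V

# toy check: d_latent=4 vs d_model=16 -> 4x smaller cache per token
rng = np.random.default_rng(2)
seq, d_model, d_latent = 6, 16, 4
z = mla_attention(rng.normal(size=(seq, d_model)),
                  rng.normal(size=(d_model, d_model)),
                  rng.normal(size=(d_model, d_latent)),
                  rng.normal(size=(d_latent, d_model)),
                  rng.normal(size=(d_latent, d_model)))
print(z.shape)  # (6, 16)
```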
Transformer model based on the research paper "Attention Is All You Need".
Official implementation of the paper "FedLSF: Federated Local Graph Learning via Specformers" (IEEE DCOSS 2024)
A Transformer Encoder where the embedding size can be down-sized.
A small language model built from scratch in PyTorch utilizing transformers. Trained on WikiText-2 with character- and word-level tokenization. Educational project to explore embeddings, positional encodings, multi-head self-attention, and transformer decoder blocks for text generation.
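Of the components listed above, sinusoidal positional encoding is the easiest to show in isolation; the sketch below follows the formula from "Attention is All You Need" (PE[pos, 2i] = sin(pos / 10000^(2i/d)), PE[pos, 2i+1] = cos of the same angle). It is a generic illustration in NumPy, not code from the repository described here.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Build the (seq_len, d_model) table of fixed positional encodings."""
    pos = np.arange(seq_len)[:, None]            # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]         # (1, d_model/2)
    angles = pos / (10000 ** (2 * i / d_model))  # one angle per (pos, pair)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dims: sine
    pe[:, 1::2] = np.cos(angles)                 # odd dims: cosine
    return pe

pe = sinusoidal_positional_encoding(50, 16)
print(pe.shape)      # (50, 16)
print(pe[0, :4])     # position 0: sin(0)=0, cos(0)=1 -> [0. 1. 0. 1.]
```

These encodings are simply added to the token embeddings before the first attention layer, injecting order information into an otherwise permutation-invariant model.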
Deployed locally
3D Printing Extrusion Detection using Multi-Head Attention Model