list of efficient attention modules
Google Colab (Jupyter) notebooks for creating and training state-of-the-art Music AI models and for generating music with Transformer technology (Google XLNet/Transformer-XL)
Symbolic music generation taking inspiration from NLP and human composition process
Chinese version of reformer-pytorch: a simple and efficient generative model with GPT-2-like results.
An adaptation of Reformer: The Efficient Transformer for text-to-speech task.
A dedicated convenient repo for different Music Transformers implementations (Reformer/XTransformer/Sinkhorn/etc)
Natural Language Generation using Reformer, a Transformer model for longer sequences
Music AI implementation based on Google's Reformer transformer, with code and a Colab notebook.
NLP Code Snippets and Conference related
This repository has code for a chatbot using the Reformer model, trained on the MultiWOZ dataset.
An implementation of multiple notable attention mechanisms using TensorFlow 2
Imran Parthib 🚀 Enthusiastic Web Developer and programmer 🌐 Crafting seamless digital experiences with passion and precision. Proudly representing the vibrant spirit of Bangladesh
Scientific Guide AI notebooks is a collection of machine learning and deep learning notebooks prepared by Salem Messoud.
Grammatical Error Correction at the character level using Reformers.
Natural Language Processing
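The repositories above all build on Reformer, whose core trick (per the Reformer paper) is locality-sensitive hashing: tokens are bucketed by random rotations so attention only needs to compare vectors within a bucket. A minimal NumPy sketch of that bucketing step, independent of any implementation listed here (the function name and shapes are illustrative assumptions):

```python
import numpy as np

def lsh_buckets(vectors, n_buckets=8, seed=0):
    """Illustrative sketch: assign each vector a bucket via random-rotation
    LSH, the scheme Reformer uses to restrict attention to same-bucket
    tokens, cutting cost from O(L^2) toward O(L log L)."""
    rng = np.random.default_rng(seed)
    d = vectors.shape[-1]
    # Project onto n_buckets/2 random directions; a vector and its
    # negation land in opposite halves of the concatenated scores.
    rotations = rng.normal(size=(d, n_buckets // 2))
    rotated = vectors @ rotations                       # (L, n_buckets/2)
    rotated = np.concatenate([rotated, -rotated], -1)   # (L, n_buckets)
    return np.argmax(rotated, axis=-1)                  # bucket id per row

# Similar vectors tend to share a bucket; opposite vectors never do.
x = np.array([[1.0, 0.0], [0.99, 0.05], [-1.0, 0.0]])
buckets = lsh_buckets(x, n_buckets=4)
```

In a full Reformer layer this bucketing is repeated with several hash rounds and the sequence is sorted by bucket before chunked attention; the sketch shows only the hashing step.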