Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
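As background for the PyTorch seq2seq frameworks listed here, below is a minimal sketch of the encoder-decoder pattern they build on; the module is illustrative and not taken from any of these repositories.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Minimal GRU encoder-decoder; real frameworks add attention, beam search, etc."""
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hid_dim=512):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source sentence into a final hidden state.
        _, hidden = self.encoder(self.src_emb(src))
        # Decode with teacher forcing: feed the gold target tokens.
        dec_out, _ = self.decoder(self.tgt_emb(tgt), hidden)
        return self.out(dec_out)  # (batch, tgt_len, tgt_vocab) logits
```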
Implementations of a family of attention mechanisms, suitable for a wide range of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
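One member of that family is Bahdanau-style additive attention; here is a minimal TensorFlow 2.0/Keras sketch of it (layer names and dimensions are illustrative, not this library's actual API).

```python
import tensorflow as tf

class AdditiveAttention(tf.keras.layers.Layer):
    """Bahdanau-style additive attention over a sequence of encoder states."""
    def __init__(self, units):
        super().__init__()
        self.W_q = tf.keras.layers.Dense(units)  # projects the decoder query
        self.W_k = tf.keras.layers.Dense(units)  # projects the encoder states
        self.v = tf.keras.layers.Dense(1)        # scores each source position

    def call(self, query, values):
        # query: (batch, hidden); values: (batch, seq_len, hidden)
        query = tf.expand_dims(query, 1)  # (batch, 1, hidden) for broadcasting
        scores = self.v(tf.nn.tanh(self.W_q(query) + self.W_k(values)))
        weights = tf.nn.softmax(scores, axis=1)            # attention over seq_len
        context = tf.reduce_sum(weights * values, axis=1)  # weighted sum of states
        return context, weights
```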
Code for the paper "STConvS2S: Spatiotemporal Convolutional Sequence to Sequence Network for Weather Forecasting" (Neurocomputing, Elsevier)
An Implementation of Transformer (Attention Is All You Need) in DyNet
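The core operation of "Attention Is All You Need" is scaled dot-product attention; since DyNet is less widely used today, here is a minimal PyTorch sketch of that operation rather than the repository's own code.

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al., 2017)."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # (..., len_q, len_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ v, weights  # context vectors, attention map
```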
Encoder-Decoder model for Semantic Role Labeling
Handwriting Trajectory Recovery using End-to-End Deep Encoder-Decoder Network, ICPR 2018.
A deep learning model that translates English words and sentences into French.
A proposed framework for retrieving continuous chunk-level emotions via emo-rankers for Seq2Seq speech emotion recognition (SER).
A sequence-to-sequence Transformer implementation for training a model that translates Cape Verdean Creole to English.
Grounded Sequence-to-Sequence Transduction Team at JSALT 2018
An implementation of the paper "Neural Machine Translation by Jointly Learning to Align and Translate" (Bahdanau et al.).
An encoder-decoder based sequence-to-sequence (Seq2Seq) model that summarizes an Indian news article into a short paragraph with a limited number of words.
A concise summary generator for Amazon product reviews, built with Transformers, that preserves the original semantic content and user sentiment.
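A review summarizer of this kind can be prototyped with an off-the-shelf Transformer; the sketch below uses the Hugging Face `transformers` pipeline with an assumed public checkpoint, which is not necessarily what this project does.

```python
from transformers import pipeline

# Assumed checkpoint for illustration; the project may fine-tune its own model.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

review = (
    "I bought these headphones last month. The bass is punchy, the battery "
    "lasts all week, and they survived a drop onto concrete. My only gripe "
    "is that the charging case feels flimsy."
)
print(summarizer(review, max_length=40, min_length=10)[0]["summary_text"])
```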
A neural machine translation model using sequence-to-sequence modeling that translates English sentences into German.
Sentiment analysis on the IMDB dataset using Bag of Words models (Unigram, Bigram, Trigram, Bigram with TF-IDF) and Sequence to Sequence models (one-hot vectors, word embeddings, pretrained embeddings like GloVe, and transformers with positional embeddings).
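To illustrate the bag-of-words side of that comparison, here is a minimal bigram TF-IDF baseline in scikit-learn; the toy sentences stand in for the real IMDB reviews.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-ins for IMDB reviews (1 = positive, 0 = negative).
texts = [
    "a wonderful, moving film",
    "dull plot and wooden acting",
    "great performances throughout",
    "a complete waste of time",
]
labels = [1, 0, 1, 0]

# Unigram + bigram bag of words with TF-IDF weighting, then a linear classifier.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
model.fit(texts, labels)
print(model.predict(["what a wonderful film"]))
```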
Sequence-to-sequence encoder-decoder model with attention for Neural Machine Translation.
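At inference time such a model decodes autoregressively; below is a sketch of greedy decoding against the hypothetical Seq2Seq module sketched near the top of this list (attention omitted for brevity).

```python
import torch

def greedy_decode(model, src, bos_id, eos_id, max_len=50):
    """Greedy decoding: at each step feed back the most probable token."""
    _, hidden = model.encoder(model.src_emb(src))  # encode the source once
    token = torch.full((src.size(0), 1), bos_id,
                       dtype=torch.long, device=src.device)
    generated = []
    for _ in range(max_len):
        dec_out, hidden = model.decoder(model.tgt_emb(token), hidden)
        token = model.out(dec_out[:, -1]).argmax(-1, keepdim=True)
        generated.append(token)
        if (token == eos_id).all():  # stop once every sequence emits EOS
            break
    return torch.cat(generated, dim=1)
```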