The implementation of DeBERTa
[NeurIPS'2021] "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up", Yifan Jiang, Shiyu Chang, Zhangyang Wang
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Implementation of Transformer Model in Tensorflow
[IGARSS'22]: A Transformer-Based Siamese Network for Change Detection
pretrained BERT model for cyber security text, learned CyberSecurity Knowledge
This repository contains PyTorch implementations of the models from the paper "MIME: MIMicking Emotions for Empathetic Response Generation".
CASPR is a deep learning framework applying transformer architecture to learn and predict from tabular data at scale.
The repo is for the Heart Disease classification project using Transformer Encoders in PyTorch.
Temporarily remove unused tokens during training to save RAM and speed up training.
This project aims to implement the Transformer Encoder blocks using various Positional Encoding methods.
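As background for encoder blocks like the one described above, the classic sinusoidal positional encoding from "Attention Is All You Need" can be sketched in a few lines. This is a minimal NumPy sketch; the function name and shapes are illustrative and not taken from any repository listed here:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position encodings.

    Assumes d_model is even; even columns hold sines, odd columns cosines.
    """
    positions = np.arange(seq_len)[:, None]        # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]       # (1, d_model // 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                   # even indices: sine
    pe[:, 1::2] = np.cos(angles)                   # odd indices: cosine
    return pe

pe = sinusoidal_positional_encoding(seq_len=8, d_model=16)
```

Because the encoding depends only on position and dimension, it can be precomputed once and added to the token embeddings before the first encoder block.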
Official PyTorch implementation of "Roles and Utilization of Attention Heads in Transformer-based Neural Language Models" (ACL 2020)
Transformer Encoder with Multiscale Deep Learning for Pain Classification Using Physiological Signals
Vision Transformer Implementation in TensorFlow
Code for the ACL 2019 paper "Observing Dialogue in Therapy: Categorizing and Forecasting Behavioral Codes"
Contextual embedding for text blobs.
PyTorch implementation of RealFormer: Transformer Likes Residual Attention
Transformer OCR is an Optical Character Recognition toolkit built for researchers working on OCR for both Vietnamese and English. This project focuses on variants of the vanilla Transformer (Conformer) and CNN-based feature extraction.
✨ Solve the multi-dimensional multiple knapsack problem using state-of-the-art reinforcement learning algorithms and transformers
Repository for a transformer I coded from scratch and trained on the tiny-shakespeare dataset.
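A transformer coded from scratch, like the one above, centers on scaled dot-product attention. The sketch below shows the core computation in NumPy; the function name, shapes, and single-head setup are simplifying assumptions for illustration, not code from the repository:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Single-head attention. q, k, v: (seq_len, d_k) arrays.

    Returns the attended output and the attention weight matrix.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                      # (seq_len, seq_len)
    # Numerically stable softmax over the key axis.
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))       # 4 tokens, d_k = 8
out, w = scaled_dot_product_attention(x, x, x)
```

Each row of `w` sums to 1, so the output for each token is a convex combination of the value vectors; a full encoder adds learned Q/K/V projections, multiple heads, and residual/feed-forward sublayers around this core.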