ChatLM-Chinese-0.2B, a 0.2B-parameter Chinese dialogue model, with fully open-sourced code for the entire pipeline: dataset sources, data cleaning, tokenizer training, model pretraining, SFT instruction fine-tuning, and RLHF optimization. Supports SFT fine-tuning for downstream tasks, with a worked example of fine-tuning for triple (entity–relation–entity) information extraction.
simpleT5, built on top of PyTorch Lightning⚡️ and Transformers🤗, lets you quickly train your T5 models.
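Per the simpleT5 README, training data is passed as a pandas DataFrame with `source_text` and `target_text` columns. A minimal sketch of preparing such a frame for summarization (the commented training calls assume the `simplet5` package is installed; the example text is illustrative):

```python
import pandas as pd

# simpleT5 expects exactly these two column names.
train_df = pd.DataFrame({
    "source_text": ["summarize: The quick brown fox jumps over the lazy dog."],
    "target_text": ["A fox jumps over a dog."],
})

# Training itself (requires `pip install simplet5`):
# from simplet5 import SimpleT5
# model = SimpleT5()
# model.from_pretrained(model_type="t5", model_name="t5-base")
# model.train(train_df=train_df, eval_df=train_df, max_epochs=1)

print(list(train_df.columns))  # ['source_text', 'target_text']
```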
The "LLM Projects Archive" is a centralized GitHub repository offering a diverse collection of large language model (LLM) projects. A valuable resource for researchers, developers, and enthusiasts, it showcases the latest advancements and applications in the realm of LLMs. Explore and contribute to the dynamic landscape of language-model projects.
Implements a question generator using SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.).
An extension of the Transformers library adding a T5ForSequenceClassification class.
Abstractive text summarization by fine-tuning seq2seq models.
This repository contains the data and code for the paper "Diverse Text Generation via Variational Encoder-Decoder Models with Gaussian Process Priors" (SPNLP@ACL2022)
Code Implementation for "NASH: A Simple Unified Framework of Structured Pruning for Accelerating Encoder-Decoder Language Models" (EMNLP 2023)
Official repository of Generating Multiple-Length Summaries via Reinforcement Learning for Unsupervised Sentence Summarization [EMNLP'22 Findings]
T5 Fine-tuning on SQuAD Dataset for Question Generation
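For SQuAD-style question generation with T5, a common convention (an assumption here, not stated by the repo description) is to serialize each example into an `answer: … context: …` source string with the question as the target:

```python
def format_qg_example(answer: str, context: str, question: str) -> tuple[str, str]:
    """Build a (source, target) pair for text-to-text question generation."""
    source = f"answer: {answer} context: {context}"
    return source, question

# Hypothetical SQuAD-style example for illustration.
src, tgt = format_qg_example(
    answer="1912",
    context="The Titanic sank in 1912 after striking an iceberg.",
    question="In what year did the Titanic sink?",
)
print(src)
print(tgt)
```

The model then learns to map the source string to the question, so no architecture changes are needed beyond standard seq2seq fine-tuning.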
T5 (Text-to-Text Transformer Model) for Farsi.
An enhanced open dialogue context generator supported by General Language Model Pretraining with Autoregressive Blank Infilling.
A repository for my diploma thesis, "Semantic communications framework with Transformer based models".
A Python script to fine-tune pre-trained models for text generation.
A LongT5-based model pre-trained on a large amount of unlabeled Vietnamese news text and fine-tuned on the ViMS and VMDS collections.
Creates a summary of a YouTube video based on its subtitles.
Generates multiple-choice questions (MCQs) from any type of text, along with answers and distractors.
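One simple way to pick distractors for an MCQ (a toy sketch, not necessarily the project's actual method) is to sample other candidate answers extracted from the same text, excluding the correct one:

```python
import random

def pick_distractors(correct: str, candidates: list[str], k: int = 3) -> list[str]:
    """Return up to k distractors drawn from candidates, excluding the answer."""
    pool = [c for c in candidates if c.lower() != correct.lower()]
    return random.sample(pool, min(k, len(pool)))

# Hypothetical candidate answers extracted from a passage.
candidates = ["Paris", "London", "Berlin", "Madrid", "Rome"]
distractors = pick_distractors("Paris", candidates)
print(distractors)  # three city names, none of them "Paris"
```

Real systems typically rank distractors by semantic similarity to the answer rather than sampling uniformly, so plausible-but-wrong options come first.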