An efficient, flexible and full-featured toolkit for fine-tuning LLM (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
InternLM-XComposer2.5-OmniLive: A Comprehensive Multimodal System for Long-term Streaming Video and Audio Interactions
[ICLR 2025] Alignment Data Synthesis from Scratch by Prompting Aligned LLMs with Nothing. Your efficient and high-quality synthetic data generation pipeline!
✨ A synthetic dataset generation framework that produces diverse coding questions and verifiable solutions - all in one framework
[EMNLP 2023] Lion: Adversarial Distillation of Proprietary Large Language Models
The official implementation of InstructERC
Closed-Loop Supervised Fine-Tuning of Tokenized Traffic Models. CVPR Oral 2025.
LogLLM: Log-based Anomaly Detection Using Large Language Models (system log anomaly detection)
[ACL-25] We introduce ScaleQuest, a scalable, novel and cost-effective data synthesis method to unleash the reasoning capability of LLMs.
Example code for fine-tuning multimodal large language models with LLaMA-Factory
[NeurIPS 2024 Main Track] Code for the paper titled "Instruction Tuning With Loss Over Instructions"
[ACL 2024] Learning to Edit: Aligning LLMs with Knowledge Editing
Official repository of "Inst-IT: Boosting Multimodal Instance Understanding via Explicit Visual Prompt Instruction Tuning"
Official implementation for "Diffusion Instruction Tuning"
EMNLP'2024: Knowledge Verification to Nip Hallucination in the Bud
[AAAI 2025] Automatically Generating Numerous Context-Driven SFT Data for LLMs across Diverse Granularity
Code for Paper (Preserving Diversity in Supervised Fine-tuning of Large Language Models)
[ACL 2025] Instruction-Tuning Data Synthesis from Scratch via Web Reconstruction
Fine-tune a large language model on a mathematics dataset
This project streamlines the fine-tuning process, enabling you to leverage Llama-2's capabilities for your own projects.
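The projects above all revolve around supervised fine-tuning (SFT), and most share a common preprocessing step: turning instruction-response pairs into templated training records. A minimal sketch of that step is shown below; the Alpaca-style template and the field names are illustrative assumptions, not taken from any specific repository listed here.

```python
# Minimal sketch of SFT data formatting (illustrative; the template and
# field names are assumptions, not from any listed project).

def format_example(instruction: str, response: str) -> dict:
    """Build one SFT training record using an Alpaca-style prompt template."""
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )
    # During training, loss is typically computed only on the response
    # tokens; the prompt span is masked out (e.g. label -100 in many trainers).
    return {"prompt": prompt, "completion": response}

dataset = [
    format_example("Add 2 and 3.", "2 + 3 = 5."),
    format_example("Name the capital of France.", "Paris."),
]
```

Toolkits such as XTuner and LLaMA-Factory handle this templating (and the prompt-token loss masking) internally via configurable chat templates, so in practice you usually supply raw instruction-response pairs and select a template rather than formatting records by hand.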