🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Updated May 6, 2025 · Python
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
Open-source offline translation library written in Python
Large Concept Models: Language modeling in a sentence representation space
🏡 Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
A family of diffusion models for text-to-audio generation.
Running Llama 2 and other open-source LLMs locally on CPU for document Q&A
[ACL 2021] LM-BFF: Better Few-shot Fine-tuning of Language Models https://arxiv.org/abs/2012.15723
Active Learning for Text Classification in Python
Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
Unofficial API Wrapper for Perplexity.ai + Account Generator with Web Interface
Latency and Memory Analysis of Transformer Models for Training and Inference
PromptInject is a framework that assembles prompts in a modular fashion to provide a quantitative analysis of the robustness of LLMs to adversarial prompt attacks. 🏆 Best Paper Awards @ NeurIPS ML Safety Workshop 2022
💬 Chatbot web app + HTTP and WebSocket endpoints for LLM inference with the Petals client
PyTorch + HuggingFace code for RetoMaton: "Neuro-Symbolic Language Modeling with Automaton-augmented Retrieval" (ICML 2022), including an implementation of kNN-LM and kNN-MT
ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on customized datasets, described in the paper “ExtremeBERT: A Toolkit for Accelerating Pretraining of Customized BERT”.
DSIR large-scale data selection framework for language model training
On-device LLM Inference Powered by X-Bit Quantization
Python library & framework to build custom translators for the hearing-impaired and translate between Sign Language & Text using Artificial Intelligence.
Train very large language models in JAX.