Stars
CodonTransformer: a tool for codon optimization that adapts DNA sequences for heterologous protein expression across 164 species.
News and learning-material links related to GPU programming
Training materials associated with NVIDIA's CUDA Training Series (www.olcf.ornl.gov/cuda-training-series/)
High-quality resources on GPU programming and architecture
DSPy: The framework for programming—not prompting—foundation models
🔥 Turn entire websites into LLM-ready markdown or structured data. Scrape, crawl and extract with a single API.
[ACL2024] T-Eval: Evaluating Tool Utilization Capability of Large Language Models Step by Step
A framework for few-shot evaluation of language models.
Utilities intended for use with Llama models.
TensorHue is a Python library that visualizes tensors right in your console, making it easier to understand and debug tensor contents.
A programming framework for agentic AI 🤖
Toolkit for fine-tuning, ablating and unit-testing open-source LLMs.
Set of tools to assess and improve LLM security.
A llama3 implementation, one matrix multiplication at a time
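In the same spirit as that repo, single-head attention can be written as a handful of matrix multiplications. This is a minimal numpy sketch under assumed shapes (one head, no masking, no positional encoding), not the repo's actual code:

```python
import numpy as np

def attention(x, w_q, w_k, w_v):
    # Project the input into queries, keys, and values (three matmuls).
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Scaled dot-product scores, then a numerically stable row-wise softmax.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values: one more matmul.
    return weights @ v

rng = np.random.default_rng(0)
seq, d = 5, 8
x = rng.standard_normal((seq, d))
out = attention(x, *(rng.standard_normal((d, d)) for _ in range(3)))
```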
QLoRA: Efficient Finetuning of Quantized LLMs
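QLoRA trains low-rank LoRA adapters on top of a 4-bit-quantized base model. A conceptual numpy sketch of just the LoRA part (`lora_forward` is a hypothetical helper, not the repo's API): the frozen weight gets a low-rank update `(alpha/r) * B @ A`, with `B` initialized to zero so the adapter starts as a no-op:

```python
import numpy as np

def lora_forward(x, w_base, a, b, alpha):
    r = a.shape[0]                     # LoRA rank
    delta = (alpha / r) * (b @ a)      # low-rank weight update
    return x @ (w_base + delta).T      # base weight stays frozen

rng = np.random.default_rng(0)
d_in, d_out, r = 6, 4, 2
w = rng.standard_normal((d_out, d_in))
a = rng.standard_normal((r, d_in))    # trainable factor A
b = np.zeros((d_out, r))              # B starts at zero: delta is 0 at init
x = rng.standard_normal((3, d_in))
y = lora_forward(x, w, a, b, alpha=16)
```

At initialization the output matches the base model exactly; only `A` and `B` (a tiny fraction of the parameters) receive gradients during finetuning.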
Accessible large language models via k-bit quantization for PyTorch.
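A conceptual sketch of the simplest k-bit scheme, absmax int8 quantization (illustrative only, not the actual bitsandbytes kernels): scale by the absolute maximum so values fit in [-127, 127], round to int8, and keep the scale for dequantization:

```python
import numpy as np

def absmax_quantize(x):
    scale = np.abs(x).max() / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

x = np.array([0.5, -1.0, 0.25, 2.0], dtype=np.float32)
q, scale = absmax_quantize(x)
x_hat = dequantize(q, scale)   # close to x, within rounding error
```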
Development repository for the Triton language and compiler
Finetune Llama 3.1, Mistral, Phi & Gemma LLMs 2-5x faster with 80% less memory
GaLore: Memory-Efficient LLM Training by Gradient Low-Rank Projection
An easy-to-use, scalable, high-performance RLHF framework (70B+ PPO full tuning & iterative DPO & LoRA & Mixtral)
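Of the methods listed, DPO is the simplest to state. A stdlib-only sketch of its per-pair objective (hypothetical helper, not the framework's API), where each argument is the summed log-prob of a response under the policy (`pi_*`) or the frozen reference model (`ref_*`):

```python
import math

def dpo_loss(pi_chosen, pi_rejected, ref_chosen, ref_rejected, beta=0.1):
    # Margin: how much more the policy prefers chosen over rejected,
    # relative to the reference model.
    margin = beta * ((pi_chosen - ref_chosen) - (pi_rejected - ref_rejected))
    return -math.log(1.0 / (1.0 + math.exp(-margin)))  # -log sigmoid(margin)

# Policy prefers the chosen response more than the reference does:
loss_better = dpo_loss(-1.0, -3.0, -2.0, -2.0)
# Policy agrees exactly with the reference: loss is -log(0.5).
loss_neutral = dpo_loss(-2.0, -2.0, -2.0, -2.0)
```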
Efficient LLM inference on Slurm clusters using vLLM.
A native PyTorch library for large model training
Machine Learning Engineering Open Book
Deep learning for dummies. All the practical details and useful utilities that go into working with real models.
Inspect: A framework for large language model evaluations