Easily fine-tune, evaluate and deploy gpt-oss, Qwen3, DeepSeek-R1, or any open source LLM / VLM!
-
Updated
Dec 2, 2025 - Python
Easily fine-tune, evaluate and deploy gpt-oss, Qwen3, DeepSeek-R1, or any open source LLM / VLM!
Run ChatGPT OSS, Groke 2 Locally — GGUF Loader with its floating button, ai Models | Open Source & Offline
Local deployment of gpt-oss-20b model in AWS EC2 instance.
agentsculptor is an experimental AI-powered development agent designed to analyze, refactor, and extend Python projects automatically. It uses an OpenAI-like planner–executor loop on top of a vLLM backend, combining project context analysis, structured tool calls, and iterative refinement. It has only been tested with gpt-oss-120b via vLLM.
This project implements a text classification system powered by Large Language Models (LLMs) running locally. The goal is to leverage the capabilities of modern LLMs to automatically categorize and label text data without relying on external APIs or manual human labeling, ensuring privacy, autonomy, and efficiency in text processing tasks.
No Hopper architecture (RTX 5090, etc.) required! <16 GB VRAM, Windows.
A sophisticated red-teaming agent built with LangGraph and Ollama to probe OpenAI's GPT-OSS-20B model for vulnerabilities and harmful behaviors. (Specifically built for the OpenAI Open Model Hackathon)
This repository is built based on our recent Paper
InstaCheck is a Chrome extension that helps users verify the authenticity of claims in Instagram Reels in real time.
Load testing openai/gpt-oss-20b with vLLM and Docker
FastAPI OSS LLM Gateway — OpenAI-compatible API for local backends (Ollama, vLLM, LM Studio) with file-aware prompts.
"SQL Where Clause" Extraction from User Chat Query with LLM
Kickstart your codex mcp software integration
Eve is a real-time, multilingual voice assistant built on LiveKit. It listens, transcribes with Whisper, thinks with GPT-OSS 20B, and responds using ElevenLabs TTS—low-latency, natural, and noise-free.
AI-powered GitHub repository assistant using Model-Context-Protocol (MCP). Features a local agent that queries GitHub APIs through a Dockerized MCP Gateway, delivering intelligent, context-aware answers about any repository.
TAKON is a RAG-based Q&A app that lets you interact with your own documents using GPT-OSS-20B. Built with Streamlit for an intuitive browser interface.
AI Agent-Based Nutrition Advisor
Add a description, image, and links to the gpt-oss-20b topic page so that developers can more easily learn about it.
To associate your repository with the gpt-oss-20b topic, visit your repo's landing page and select "manage topics."