Slimmed, cleaned and fine-tuned oh-my-opencode fork that consumes far fewer tokens
Updated Mar 11, 2026 · TypeScript
🦖 X—LLM: Cutting Edge & Easy LLM Finetuning
A Python deep learning framework with lazy evaluation, automatic differentiation, and a PyTorch-like API. Features include neural network modules, data loading, training utilities, model serving, and integrations with MLflow, W&B, ONNX, and Jupyter.
⚡ Sub-200ms RAG API built in Rust — document ingestion, Milvus vector search, Jina AI local embeddings, and LLM streaming in a single async binary. Powered by Cerebras, Groq, and more.
PHP library for interacting with AI platform providers.
A robust Node.js proxy server that automatically rotates API keys for Gemini and OpenAI APIs when rate limits (429 errors) are encountered. Built with zero dependencies and comprehensive logging.
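The key-rotation behavior described above can be sketched as follows. This is a hedged illustration only, not the repo's actual implementation: the `KeyRotator` class, `requestWithRotation` helper, and the `send` callback are hypothetical names introduced here to show round-robin rotation on HTTP 429 responses.

```typescript
// Minimal sketch of round-robin API key rotation on rate limits (429).
// All names here are illustrative, not taken from the actual project.
class KeyRotator {
  private index = 0;
  constructor(private keys: string[]) {
    if (keys.length === 0) throw new Error("at least one key required");
  }
  // The key currently in use.
  current(): string {
    return this.keys[this.index];
  }
  // Advance to the next key (wrapping around) after a rate-limit error.
  rotate(): string {
    this.index = (this.index + 1) % this.keys.length;
    return this.current();
  }
}

// Retry a request, switching keys whenever the upstream returns 429.
async function requestWithRotation(
  rotator: KeyRotator,
  send: (key: string) => Promise<{ status: number }>,
  maxAttempts = 3,
): Promise<{ status: number }> {
  let res = await send(rotator.current());
  for (let attempt = 1; attempt < maxAttempts && res.status === 429; attempt++) {
    res = await send(rotator.rotate());
  }
  return res;
}
```

In a real proxy, `send` would forward the incoming request to the Gemini or OpenAI endpoint with the given key in the `Authorization` header; the sketch keeps it abstract so the rotation logic stands alone.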
Ultra-fast, customizable AI voice dictation in any active app on Windows (macOS and Linux coming soon)
A tool to keep tabs on your Cerebras Code usage limits in real time
🦖 X—LLM: Simple & Cutting Edge LLM Finetuning
AI-powered geopolitical news intelligence platform. Ingests 100K+ daily events from GDELT, stores in MotherDuck (DuckDB), orchestrates with Dagster, and features an AI chat interface with Text-to-SQL. Full data engineering stack at $0/month.
This repository features an example of how to use the xllm library, including a solution to a common type of assessment given to LLM engineers.
A solution that prioritizes patients based on urgency, reducing wait times and ensuring those who need immediate care receive it.
Matrix decomposition and multiplication on the Cerebras Wafer-Scale Engine (WSE) architecture
🚀 MCP Gateway with Semantic Routing — One API for all your MCP tools. Natural language in → right tool executed. Blazing fast (Cerebras) + always reliable (multi-LLM fallback).