ContextGem: Effortless LLM extraction from documents
Updated Aug 26, 2025 - Python
Document summarization app built with a large language model (LLM) and the LangChain framework. It uses a pre-trained T5 model and its tokenizer from the Hugging Face Transformers library, wired into a summarization pipeline that generates the summary.
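A minimal sketch of the approach that description outlines: loading a pre-trained T5 model through the Hugging Face Transformers `pipeline` API and summarizing long text chunk by chunk. The model name (`t5-small`), the word-based chunk size, and the `chunk_text`/`summarize` helpers are illustrative assumptions, not the repo's actual code.

```python
def chunk_text(text: str, max_words: int = 400) -> list[str]:
    # Split long input into word-bounded chunks so each piece fits
    # within the model's input length (chunk size is an assumption).
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def summarize(text: str, model_name: str = "t5-small") -> str:
    # Deferred import so the sketch loads even without transformers
    # installed; pipeline() fetches the model and tokenizer from the
    # Hugging Face Hub on first use.
    from transformers import pipeline
    summarizer = pipeline("summarization", model=model_name)
    parts = [
        summarizer(chunk, max_length=80, min_length=10,
                   do_sample=False)[0]["summary_text"]
        for chunk in chunk_text(text)
    ]
    return " ".join(parts)
```

Chunking before summarizing is a common workaround for the fixed context window of encoder-decoder models like T5; each chunk is summarized independently and the partial summaries are concatenated.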
A type-safe graph execution framework built on top of OpenLit for LLM pipelines
Compose, train and test fast LLM routers
Sage – Prompt-Based Data Generation & Annotation Platform
CLI tool for LLM prompt pipelines. Reusable. Shareable. Scriptable.