Flexible and lightweight library for creating prompt templates
Updated Oct 10, 2025 - Python
Vim as a perfect playground for large language model prompts
Prompt templating and versioning using jinja2 and litellm 🔥
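The repository above combines Jinja2 templating with LiteLLM for model calls. A minimal sketch of the underlying idea — versioned, parameterized prompt templates — using only the standard library (the function and registry names here are illustrative assumptions, not the repository's actual API):

```python
# Prompt templating with versioning, sketched with the stdlib string.Template.
# The real project uses Jinja2; this only illustrates the registry pattern.
from string import Template

# Registry mapping (name, version) -> template text.
PROMPTS = {
    ("summarize", "v1"): Template("Summarize the following text:\n$text"),
    ("summarize", "v2"): Template(
        "Summarize the following text in $max_words words or fewer:\n$text"
    ),
}

def render_prompt(name: str, version: str, **params) -> str:
    """Look up a versioned template and fill in its parameters."""
    return PROMPTS[(name, version)].substitute(**params)

print(render_prompt("summarize", "v2", max_words=50, text="LLMs are ..."))
```

Keeping versions side by side in one registry makes it easy to A/B-test a prompt change without losing the old wording.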
Schema-first AI analysis CLI that transforms messy data into structured insights. Define your output format, get guaranteed JSON results from any source. Combines OpenAI models with multi-tool orchestration (Code Interpreter, File Search, Web Search, MCP) for AI-powered data synthesis.
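The schema-first pattern described above can be sketched in a few lines: declare the expected output shape up front, then validate whatever the model returns against it before using it. The hand-rolled validator and `SCHEMA` dict below are stand-ins (the repository itself relies on OpenAI structured outputs, not this code):

```python
# Schema-first output handling: parse model output as JSON, then check each
# declared field's presence and type before trusting the result.
import json

SCHEMA = {"title": str, "sentiment": str, "score": float}

def parse_result(raw: str) -> dict:
    data = json.loads(raw)
    for key, typ in SCHEMA.items():
        if not isinstance(data.get(key), typ):
            raise ValueError(f"field {key!r} is missing or not {typ.__name__}")
    return data

ok = parse_result('{"title": "Q3 report", "sentiment": "positive", "score": 0.9}')
```

Failing fast on a malformed response is what turns "usually JSON" model output into something downstream code can rely on.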
This repository, forked from Packt Publishing, is a comprehensive guide to LangChain and LLMs, collecting the resources and knowledge from the on-demand course.
Programmatic prompt template for Python.
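A "programmatic" prompt template means the prompt is an object rather than a raw string, so templates can be composed and reused. The class and method names below are assumptions for illustration, not this library's API:

```python
# A prompt template as a small dataclass: format() fills in parameters,
# and __add__ composes two templates into a new one.
from dataclasses import dataclass

@dataclass
class PromptTemplate:
    template: str

    def format(self, **kwargs) -> str:
        return self.template.format(**kwargs)

    def __add__(self, other: "PromptTemplate") -> "PromptTemplate":
        # Composing two templates yields a new template, joined by a newline.
        return PromptTemplate(self.template + "\n" + other.template)

system = PromptTemplate("You are a helpful assistant.")
task = PromptTemplate("Answer the question: {question}")
prompt = (system + task).format(question="What is a prompt template?")
```

Because composition returns another `PromptTemplate`, a library of small reusable pieces can be assembled into full prompts without string surgery.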
This project uses the TinyLLaMA foundation model to generate blog content, with prompt templates guiding and structuring the outputs. To optimize cost, the model is stored and run locally instead of through paid APIs, significantly reducing ongoing expenses, since storage costs are considerably lower than per-call API usage.
Youtube-Transcript-Interpreter is a LangChain application that uses the OpenAI API and Streamlit to display query results based on a YouTube video's transcript.
"Will you be able to win?" This is the game that challenges Human to face LLM models in a variety of games, from guessing to association/pattern games... challenge yourself!
A simple LLM application with chat models and prompt templates
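The pattern such an application follows can be sketched without any provider SDK: fill in prompt templates, then assemble them into the chat-message list that OpenAI-style chat models accept. The model call itself is omitted here; swap in a real client (e.g. the OpenAI SDK or LangChain) to run the prompt, and note that `build_messages` is a hypothetical helper, not this repository's code:

```python
# Build a chat-message list from two prompt templates. Each message follows
# the common {"role": ..., "content": ...} shape used by chat-model APIs.
def build_messages(system_template: str, user_template: str, **params):
    return [
        {"role": "system", "content": system_template.format(**params)},
        {"role": "user", "content": user_template.format(**params)},
    ]

messages = build_messages(
    "You translate text into {language}.",
    "Translate: {text}",
    language="French",
    text="Hello",
)
```

Separating the system and user templates keeps the application's instructions fixed while only the user-facing parameters vary per request.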
Describes each file of a Python project by asking a generative AI model to produce a natural-language explanation of each file
A Q&A application built using LangChain and OpenAI's Davinci model.
An LLM application that generates a summary, a list of citations and references, and a response to a user's query based on a research paper's content.