A coding experiment from the early days of the LLM boom (2023), now updated with local LLM support.
Note: This documentation was updated in February 2025 using Claude 3.5 Sonnet.
🇪🇪 Eestikeelne versioon (Estonian)
This is a simple tool that uses LLMs to explain concepts through simulated conversations. Originally built with OpenAI's API during the initial ChatGPT excitement, it has now been updated to also work with Ollama for local, offline usage.
Note: Currently, the tool only accepts input in English (concept, role, and audience). Support for other languages may be added in the future.
Given a concept, a specialist role, and a target audience, it generates an explanation in a dialogue format. For example:
- Concept: "black holes"
- Specialist: "astrophysicist"
- Audience: "five-year-old"
The output is formatted in Markdown and includes:
- Basic explanation
- Follow-up questions and answers
- Examples where possible
- Brief etymology and history
- Related concepts
- Works with OpenAI API (original version) or Ollama (new in 2025)
- Command line and basic web interface
- Saves explanations as Markdown files
- Simple search functionality
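The "simple search" feature can be pictured as a case-insensitive scan over the saved Markdown files. The sketch below is illustrative only (the function name `search_explanations` and the `explanations` directory are assumptions, not the project's actual code):

```python
from pathlib import Path

def search_explanations(term, directory="explanations"):
    """Return the names of saved Markdown files containing `term`,
    matched case-insensitively. Illustrative sketch, not project code."""
    term = term.lower()
    hits = []
    for path in Path(directory).glob("*.md"):
        if term in path.read_text(encoding="utf-8").lower():
            hits.append(path.name)
    return sorted(hits)
```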
You need Python 3.6+ and either:
- An OpenAI API key, or
- Ollama installed locally (free, works offline)
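Before choosing the Ollama route, you can check that a local server is actually reachable. This is a minimal preflight sketch (the helper name is ours, not part of the project); Ollama's `/api/tags` endpoint lists the models you have pulled locally:

```python
import json
import urllib.request

def ollama_available(host="http://localhost:11434"):
    """Return the list of locally pulled Ollama model names,
    or None if the server is not reachable."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except OSError:
        return None

models = ollama_available()
if models is None:
    print("Ollama not reachable; use the OpenAI provider (API key required).")
else:
    print("Ollama models:", models)
```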
```
git clone https://github.com/klauseduard/concept-explainer.git
cd concept-explainer
pip install -r requirements.txt
```

Configure your `.env` file:
```
# For OpenAI:
LLM_PROVIDER=openai
OPENAI_API_KEY=your-api-key-here
OPENAI_MODEL=gpt-3.5-turbo
OPENAI_TEMPERATURE=0.2

# Or for Ollama:
LLM_PROVIDER=ollama
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=mistral-small
OLLAMA_TEMPERATURE=0.2
```

Basic command format:
```
python explain.py <concept> <specialist_role> <target_audience> --additional_context <context>
```

Example:
python explain.py "black holes" "astrophysicist" "five-year-old" --additional_context "Assume they know what stars are."Start the web interface:
```
python web_interface.py
```

Then open http://localhost:5000 in your browser.
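Under the hood, a provider-agnostic request can be assembled from the `.env` settings above. The sketch below is illustrative (the function name `build_request` is an assumption, not the project's code); the endpoint shapes follow the public OpenAI chat-completions and Ollama `/api/generate` HTTP APIs:

```python
import os

def build_request(prompt, provider=None):
    """Build (url, payload) for the configured provider.
    Illustrative sketch; payload shapes follow the public
    OpenAI and Ollama HTTP APIs."""
    provider = provider or os.getenv("LLM_PROVIDER", "openai")
    if provider == "openai":
        url = "https://api.openai.com/v1/chat/completions"
        payload = {
            "model": os.getenv("OPENAI_MODEL", "gpt-3.5-turbo"),
            "temperature": float(os.getenv("OPENAI_TEMPERATURE", "0.2")),
            "messages": [{"role": "user", "content": prompt}],
        }
    else:  # ollama
        url = os.getenv("OLLAMA_HOST", "http://localhost:11434") + "/api/generate"
        payload = {
            "model": os.getenv("OLLAMA_MODEL", "mistral-small"),
            "options": {"temperature": float(os.getenv("OLLAMA_TEMPERATURE", "0.2"))},
            "prompt": prompt,
            "stream": False,  # return one complete response instead of a stream
        }
    return url, payload
```

The same prompt can thus be sent to either backend, with only the URL and payload shape differing.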
- Requires API key
- Default model: `gpt-3.5-turbo`
- Also supports: `gpt-3.5-turbo-0125`, `gpt-4`, `gpt-4-0125`
- Should work with newer models as they become available
- See pricing at: https://openai.com/pricing
- Free, runs locally
- Default model: `mistral-small`
- Also works with: `llama2`, `codellama`, `neural-chat`
- Should work with any model supported by Ollama
- No API key needed
- Requires some CPU/GPU power
- Range: 0.0 to 2.0 (default: 0.2)
- Lower = more focused
- Higher = more creative
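Since the temperature arrives as a string from `.env`, it is worth validating against the 0.0-2.0 range. A small illustrative helper (the name `parse_temperature` is ours, not the project's):

```python
def parse_temperature(raw, default=0.2):
    """Parse a temperature string from the environment and clamp it
    to the valid 0.0-2.0 range; fall back to the default on bad input."""
    try:
        value = float(raw)
    except (TypeError, ValueError):
        return default
    return max(0.0, min(2.0, value))
```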
MIT
Klaus-Eduard Runnel - klaus.eduard@gmail.com
Project Link: https://github.com/klauseduard/concept-explainer