A coding experiment from the early days of the LLM boom (2023), now updated with local LLM support.
Note: This documentation was updated in February 2025 using Claude 3.5 Sonnet.
🇪🇪 Eestikeelne versioon (Estonian)
This is a simple tool that uses LLMs to explain concepts through simulated conversations. Originally built with OpenAI’s API during the initial ChatGPT excitement, it has now been updated to also work with Ollama for local, offline usage.
Note: Currently, the tool only accepts input in English (concept, role, and audience). Support for other languages may be added in the future.
Given a concept, a specialist role, and a target audience, it generates an explanation in a dialogue format. For example:
The output is formatted in Markdown and includes:
You need Python 3.6+ and either an OpenAI API key or a local Ollama installation.
```bash
git clone https://github.com/klauseduard/concept-explainer.git
cd concept-explainer
pip install -r requirements.txt
```
Configure your .env file:
```env
# For OpenAI:
LLM_PROVIDER=openai
OPENAI_API_KEY=your-api-key-here
OPENAI_MODEL=gpt-3.5-turbo
OPENAI_TEMPERATURE=0.2

# Or for Ollama:
LLM_PROVIDER=ollama
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=mistral-small
OLLAMA_TEMPERATURE=0.2
```
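If you want to sanity-check the configuration before running the tool, a short snippet like the one below can load the `.env` file and, when Ollama is selected, confirm that the local server is reachable. This is only a sketch: it assumes the `python-dotenv` and `requests` packages are installed and that the variable names match the ones shown above.

```python
# check_config.py - optional sanity check for the .env settings (illustrative sketch)
import os

import requests
from dotenv import load_dotenv

load_dotenv()  # read variables from .env into the environment

provider = os.getenv("LLM_PROVIDER", "openai")
print(f"Configured provider: {provider}")

if provider == "ollama":
    host = os.getenv("OLLAMA_HOST", "http://localhost:11434")
    # Ollama exposes a /api/tags endpoint that lists locally available models.
    resp = requests.get(f"{host}/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print(f"Ollama is running; local models: {models}")
else:
    # For OpenAI we only check that a key is present, without spending tokens.
    if not os.getenv("OPENAI_API_KEY"):
        raise SystemExit("OPENAI_API_KEY is not set in .env")
    print("OPENAI_API_KEY found.")
```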
Basic command format:
```bash
python explain.py <concept> <specialist_role> <target_audience> --additional_context <context>
```
Example:
python explain.py "black holes" "astrophysicist" "five-year-old" --additional_context "Assume they know what stars are."
Start the web interface:
```bash
python web_interface.py
```
Then open http://localhost:5000 in your browser.
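If you prefer to script the startup, the hypothetical snippet below waits until the server answers on port 5000 and then opens the page in your default browser. It only assumes that the root URL serves the interface, as described above.

```python
# open_ui.py - wait for the local web interface and open it (illustrative sketch)
import time
import webbrowser

import requests

URL = "http://localhost:5000"

# Poll for up to ~30 seconds while web_interface.py starts up.
for _ in range(30):
    try:
        if requests.get(URL, timeout=1).ok:
            break
    except requests.ConnectionError:
        time.sleep(1)
else:
    raise SystemExit(f"Web interface did not respond at {URL}")

webbrowser.open(URL)
```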
Supported models:

| Provider | Default model | Other supported models |
| --- | --- | --- |
| OpenAI | gpt-3.5-turbo | gpt-3.5-turbo-0125, gpt-4, gpt-4-0125 |
| Ollama | mistral-small | llama2, codellama, neural-chat |

License: MIT
Klaus-Eduard Runnel - klaus.eduard@gmail.com
Project Link: https://github.com/klauseduard/concept-explainer