A simple web interface for chatting with local LLMs through Ollama.
Install Ollama and pull a model:
ollama pull llama3.2
Install the dependencies and run the app:
pip install flask
python app.py
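The app's own source isn't shown here, but the core of it is a call to Ollama's HTTP chat API. A minimal sketch, assuming the default non-streaming `/api/chat` endpoint; the function names (`build_chat_request`, `extract_reply`, `chat`) are illustrative, not taken from the actual app:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address

def build_chat_request(messages, model="llama3.2", stream=False):
    """Build the JSON body for Ollama's POST /api/chat endpoint."""
    # messages is a list like [{"role": "user", "content": "..."}]
    return {"model": model, "messages": messages, "stream": stream}

def extract_reply(response):
    """Pull the assistant's text out of a non-streaming /api/chat response."""
    return response["message"]["content"]

def chat(messages, model="llama3.2", base_url=OLLAMA_URL):
    """Send a conversation to Ollama and return the model's reply text."""
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(build_chat_request(messages, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_reply(json.loads(resp.read()))
```

Passing `stream=False` makes Ollama return one complete JSON object instead of a stream of chunks, which keeps the client side simple.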
Features:
- Persistent chat history (stored locally in SQLite)
- Search across conversations
- Export to markdown
- Change Ollama URL from settings (for remote instances)
- Works with any Ollama model
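The persistence, search, and export features above map naturally onto a single SQLite table. A minimal sketch using Python's built-in sqlite3 module; the schema and function names are assumptions for illustration, not the app's actual schema:

```python
import sqlite3

def init_db(path=":memory:"):
    """Open (or create) the local chat-history database."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               id INTEGER PRIMARY KEY,
               conversation TEXT,
               role TEXT,
               content TEXT
           )"""
    )
    return conn

def save_message(conn, conversation, role, content):
    """Append one message to a conversation."""
    conn.execute(
        "INSERT INTO messages (conversation, role, content) VALUES (?, ?, ?)",
        (conversation, role, content),
    )
    conn.commit()

def search(conn, term):
    """Substring search across all conversations."""
    return conn.execute(
        "SELECT conversation, role, content FROM messages WHERE content LIKE ?",
        (f"%{term}%",),
    ).fetchall()

def export_markdown(conn, conversation):
    """Render one conversation as a markdown transcript."""
    rows = conn.execute(
        "SELECT role, content FROM messages WHERE conversation = ? ORDER BY id",
        (conversation,),
    ).fetchall()
    return "\n\n".join(f"**{role}:** {content}" for role, content in rows)
```

Because everything lives in one local SQLite file, history survives restarts with no external database to set up.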
By default, the app connects to Ollama at localhost:11434. Change this in Settings if Ollama is running elsewhere.
MIT