A modern BYOK (bring-your-own-key) desktop chatbot application that provides a unified interface for interacting with multiple AI providers. Built on LiteLLM, it gives you access to providers including OpenAI, Anthropic, Google, and OpenRouter.
*(Demo video: `PyQtChat.mp4`)*
- OpenAI: GPT-4o, GPT-4o-mini, GPT-4.1 series
- Anthropic: Claude 3.5 Sonnet, Claude Opus 4, Claude Sonnet 4
- Google: Gemini 2.5 Pro/Flash Preview
- OpenRouter: Access to 100+ models through a single API
- Custom Models: Add your own LiteLLM-compatible models
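All of these providers are reached through the same LiteLLM call. As a minimal illustration (the model names are examples; this snippet is not taken from this repository):

```python
# Minimal sketch of the unified interface LiteLLM provides: one completion() call
# covers every provider. Set the matching API key (e.g. OPENAI_API_KEY) in your
# environment or .env file before running.
import litellm

response = litellm.completion(
    model="gpt-4o-mini",  # other examples: "anthropic/claude-3-5-sonnet-20240620",
                          # "gemini/gemini-2.5-flash", "openrouter/<provider>/<model>"
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=100,
)
print(response.choices[0].message.content)
```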
- Multi-Tab Support: Manage multiple conversations simultaneously
- Message Editing: Edit and resend previous messages
- Markdown Rendering: Rich text formatting with syntax highlighting
- Cost Tracking: Real-time cost calculation per message and session
- Auto-scroll: Smooth conversation flow
- Timestamps: Optional message timestamps
- Dark/Light Mode: Toggle between themes
- Adjustable Font Size: Scale interface text
- Responsive Layout: Resizable panels and windows
- Window State Persistence: Remembers size and position
- Multiple Formats: JSON, Markdown, Plain Text
- Conversation History: Save and restore chat sessions
- Cross-Platform: Compatible export formats
- Settings Dialog: Centralized configuration management
- Environment Variables: Support for `.env` files
- API Key Management: Storage with QSettings
- Custom API Endpoints: Support for self-hosted models
- Python 3.10 or higher
- Windows, macOS, or Linux (developed on Windows; macOS and Linux are largely untested, so issues may occur on those platforms)
- Preferably use a Conda environment
- Clone the repository: `git clone <repository-url>`, then `cd ai_chatbot`
- Install dependencies: `pip install -r requirements.txt`
- Run the application: `python main.py`
- Configure API Keys:
  - Open `Settings -> Preferences -> API Keys` and add your API keys for the desired providers
  - Alternatively, create a `.env` file (this also works for the prebuilt executable) with entries such as the following (see the loading sketch after these steps):
    - `OPENAI_API_KEY=your_openai_key`
    - `ANTHROPIC_API_KEY=your_anthropic_key`
    - `GEMINI_API_KEY=your_google_key`
    - `OPENROUTER_API_KEY=your_openrouter_key`
- Select a Model:
  - Choose a provider from the dropdown
  - Select your preferred model
  - View cost information in the control panel
- Start Chatting:
  - Type your message in the input field
  - Press Enter or click "Send"
  - Enjoy the conversation!
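If you go the `.env` route, the keys only need to reach the process environment before the first request. Here is a minimal sketch using `python-dotenv` (an assumption about the loading mechanism; the application may load the file differently):

```python
# Minimal sketch: load API keys from a .env file into the process environment.
# Assumes python-dotenv is installed; LiteLLM then reads the keys from os.environ.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory, if present

for key in ("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GEMINI_API_KEY", "OPENROUTER_API_KEY"):
    print(f"{key}: {'set' if os.getenv(key) else 'missing'}")
```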
Access via `Settings → Preferences` or `Ctrl+,`:
- Configure API keys for all supported providers
- Environment variables take precedence over stored keys
- Add custom LiteLLM-compatible models
- Format: `provider/model-name`
- Examples (see the sketch below): `openrouter/microsoft/wizardlm-2-8x22b`, `ollama/llama3`, `together_ai/meta-llama/Meta-Llama-3-8B-Instruct`
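The `provider/model-name` string is handed straight to LiteLLM's `model` argument. Below is a hedged sketch of calling one of the example models; `api_base` is shown only because custom endpoints are also supported, and the URL is a placeholder (Ollama's default local port), not a value taken from this project:

```python
# Illustration: a custom model string in provider/model-name form is passed to LiteLLM as-is.
# api_base is only needed for self-hosted backends; the URL below is a placeholder.
import litellm

response = litellm.completion(
    model="ollama/llama3",              # provider/model-name format
    api_base="http://localhost:11434",  # placeholder: Ollama's default local endpoint
    messages=[{"role": "user", "content": "Say hi."}],
)
print(response.choices[0].message.content)
```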
- Dark Mode: Toggle between light and dark themes
- Font Size: Adjust interface font size
- Auto Scroll: Automatically scroll to new messages
- Show Timestamps: Display message timestamps
- Max Tokens: Set maximum response length (1-10000)
- Click "New Chat" in the sidebar
- Or press `Ctrl+N`
- Rename: Right-click tab → Rename or press `F2`
- Delete: Right-click tab → Delete or press `Ctrl+D`
- Clear: Use the "Clear Chat" button (confirms before clearing)
- Edit: Click "Edit" on any user message
- Resend: Click "Resend" to retry with same message
- Copy: Click "Copy" to copy message to clipboard
- Export: `File → Export Chat` or `Ctrl+E`
- Import: `File → Import Chat` or `Ctrl+I`
- Supported formats: JSON, Markdown, Text (illustrated in the sketch below)
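To give a feel for the export formats, the sketch below serializes a small conversation to JSON and Markdown; the `role`/`content` field names are illustrative assumptions and may not match the exact schema produced by `utils/export.py`:

```python
# Illustration only: serialize a conversation to JSON and Markdown files.
# The role/content field names are assumptions, not the project's exact export schema.
import json
from pathlib import Path

messages = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi, how can I help?"},
]

# JSON export: a list of role/content records.
Path("chat.json").write_text(json.dumps(messages, indent=2), encoding="utf-8")

# Markdown export: one heading per message.
md = "\n".join(f"### {m['role'].title()}\n\n{m['content']}\n" for m in messages)
Path("chat.md").write_text(md, encoding="utf-8")
```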
- `Ctrl+,`: Open Settings
- `Ctrl+T`: Toggle Theme
- `Ctrl+E`: Export Current Chat
- `Ctrl+I`: Import Chat
- `Ctrl+D`: Delete Current Chat
- `Ctrl+N`: Create a New Chat
- `F2`: Rename Current Chat
- `Ctrl+Q`: Exit Application
- `Enter`: Send Message
- Real-time cost calculation per message
- Session total displayed in info bar
- Supports pricing for all major models
- Estimates based on LiteLLM cost database
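Because the estimates come from LiteLLM's cost database, a response's cost can be computed directly from the completion object. The sketch below uses LiteLLM's public `completion_cost` helper and mirrors the idea, not necessarily the exact code in `utils/cost_tracker.py`:

```python
# Minimal sketch: estimate per-message cost from LiteLLM's pricing database.
import litellm

response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "One-line summary of BYOK, please."}],
)

# completion_cost() looks up per-token prices for the model and multiplies by usage.
cost_usd = litellm.completion_cost(completion_response=response)
print(f"This message cost about ${cost_usd:.6f}")
```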
Run `dev_build.bat` for a development build or `build.bat` for a production build. The executable will be created in the `dist/` directory.
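The `dist/` output directory and the `%Temp%\_MEIxxxxxx` folder mentioned under logging are characteristic of PyInstaller. Assuming that is what the build scripts wrap, a roughly equivalent one-file build could be driven from Python as sketched below; the flags are illustrative, not copied from `build.bat`:

```python
# Hedged sketch: a PyInstaller one-file build roughly equivalent to what the
# build scripts might run. Flags are illustrative, not copied from build.bat.
import PyInstaller.__main__

PyInstaller.__main__.run([
    "main.py",
    "--onefile",                       # single executable, unpacked to %Temp%\_MEIxxxxxx at runtime
    "--windowed",                      # no console window for a GUI app
    "--icon=resources/icon.ico",
    "--add-data=resources;resources",  # ';' separator is the Windows form
    "--name=ai_chatbot",
])
```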
ai_chatbot/
├── main.py # Application entry point
├── requirements.txt # Python dependencies
├── build.bat # Production build script
├── dev_build.bat # Development build script
├── core/ # Core application logic
│ ├── app_setup.py # Application initialization
│ ├── chat_worker.py # AI chat worker thread
│ ├── models.py # Model management
│ └── settings.py # Settings management
├── ui/ # User interface components
│ ├── main_window.py # Main application window
│ ├── chat_tab.py # Individual chat tab
│ ├── chat_message.py # Message display widget
│ └── settings_dialog.py # Settings configuration dialog
├── utils/ # Utility modules
│ ├── cost_tracker.py # Cost calculation utilities
│ ├── export.py # Import/export functionality
│ ├── logger.py # Logging configuration
│ ├── style_manager.py # Theme and styling
│ └── window_utils.py # Window state management
├── resources/ # Application resources
│ ├── icon.ico # Application icon
│ ├── styles.css # Light theme styles
│ └── styles_dark.css # Dark theme styles
└── logs/ # Application logs (auto-created)
- Verify API keys in Settings
- Try creating a `.env` file with the relevant variables
- Verify model name in provider documentation
- Check if model requires special access
- Try a different model from the same provider
- Some OpenRouter models are missing cost data in LiteLLM
- Some custom models may also lack cost data
- Try adjusting font size in Settings
- Toggle between light/dark mode
- Restart application after theme changes
- Logs are stored in the `logs/` directory
- Daily log rotation with filename format `chatbot_YYYYMMDD.log`
- Log level: INFO (console and file)
- If using the prebuilt executable, logs are found at `%Temp%/_MEIxxxxxx/logs` on Windows, where `xxxxxx` is a random number.
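For reference, a date-stamped INFO-level logger that writes to both the console and `logs/chatbot_YYYYMMDD.log` can be set up as sketched below; this follows the behaviour described above and is not necessarily the exact code in `utils/logger.py`:

```python
# Sketch of the described logging behaviour: INFO level, console + a daily file
# named chatbot_YYYYMMDD.log under logs/.
import logging
from datetime import datetime
from pathlib import Path

def setup_logger() -> logging.Logger:
    log_dir = Path("logs")
    log_dir.mkdir(exist_ok=True)  # logs/ is auto-created

    logger = logging.getLogger("chatbot")
    logger.setLevel(logging.INFO)
    fmt = logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")

    for handler in (
        logging.FileHandler(log_dir / f"chatbot_{datetime.now():%Y%m%d}.log", encoding="utf-8"),
        logging.StreamHandler(),
    ):
        handler.setFormatter(fmt)
        logger.addHandler(handler)
    return logger

logger = setup_logger()
logger.info("Logger initialised")
```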
This project is licensed under the MIT License - see the LICENSE file for details.
- Create an issue for bug reports
- Check existing issues before creating new ones
- Include logs and system information in bug reports
Made with ❤️ by Starosti

