Lotus T1 Chat is a modern, multi-provider AI chat extension for Visual Studio Code. It supports Deepseek r1, Ollama (local), OpenRouter, and more, with a beautiful chat UI and workspace-specific MCP server management.
- Chat with Deepseek r1, Ollama, OpenRouter, or simulated models
- Switch providers and models on the fly
- Configure API keys and endpoints easily
- Manage custom MCP servers with a dedicated UI panel
- Modern, dark/light theme-aware chat interface
- Concise, context-aware AI answers
- VS Code 1.100.0 or later
- For Ollama: Ollama running locally
- For Deepseek/OpenRouter: API keys from their respective sites
- (Optional) MCP servers for advanced workflows
- `lotusT1Chat.modelProvider`: Select the AI provider (`ollama`, `openRouter`, `deepseek`, `simulation`)
- `lotusT1Chat.ollamaUrl`: URL for the local Ollama API
- `lotusT1Chat.ollamaModel`: Default Ollama model
- `lotusT1Chat.openRouterApiKey`: OpenRouter API key
- `lotusT1Chat.openRouterModel`: Default OpenRouter model
- `lotusT1Chat.apiKey`: Deepseek API key
- `lotusT1Chat.simulationModel`: Model to simulate in simulation mode
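These settings go in your VS Code `settings.json`. A minimal sketch, assuming an Ollama setup (the model names and key values shown are placeholders, not defaults shipped by the extension):

```json
{
  "lotusT1Chat.modelProvider": "ollama",
  "lotusT1Chat.ollamaUrl": "http://localhost:11434",
  "lotusT1Chat.ollamaModel": "deepseek-r1:7b",
  "lotusT1Chat.openRouterApiKey": "sk-or-...",
  "lotusT1Chat.openRouterModel": "deepseek/deepseek-r1",
  "lotusT1Chat.apiKey": "sk-..."
}
```

Only the settings for the provider you select need to be filled in; the rest can be left unset.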
- Open the Command Palette and run "Open Lotus T1 Chat".
- Select your provider and model from the dropdown.
- Configure API keys as needed using the "Configure API Key" button.
- Manage MCP servers with the "MCP Servers" button for advanced workflows.
- Start chatting!
- MCP server management is workspace-specific and requires a folder to be open.
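As an illustration of what a workspace-level MCP server entry might look like, here is a sketch using the server-definition shape common to many MCP clients (`command`, `args`, `env`). The exact schema Lotus T1 Chat stores is an assumption here and may differ; manage entries through the "MCP Servers" panel rather than editing files by hand.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "./"],
      "env": {}
    }
  }
}
```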
- Some advanced features may require additional configuration.
- Initial release with Deepseek, Ollama, OpenRouter, simulation, and MCP management.
Enjoy Lotus T1 Chat!