A self-hosted AI infrastructure platform combining intelligent routing, workflow automation, and RAG-powered personal assistants.
The Nexus Ecosystem is my personal AI platform that connects multiple LLM providers, automation workflows, and interfaces into a unified system. It powers everything from my portfolio's AI chat to my personal Jarvis assistant on Telegram.
- 🔀 Multi-Provider AI Gateway - Route requests across Groq, Gemini, Claude, OpenRouter with automatic failover
- 🧠 RAG-Powered Chat - Open WebUI with document search and knowledge base
- ⚙️ Workflow Automation - n8n for orchestrating AI workflows and integrations
- 🤖 Personal Assistant (Jarvis) - Telegram bot for voice/text AI interactions
- 🏠 Home Lab Ready - Designed for cloud + local hybrid deployment
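The gateway's automatic failover can be sketched roughly like this. This is an illustrative TypeScript sketch, not the gateway's actual API: the `Provider` shape, `call` signature, and provider names are all assumptions for the example.

```typescript
// Hypothetical provider interface — the real gateway's types will differ.
type Provider = {
  name: string;
  call: (prompt: string) => Promise<string>;
};

// Try each provider in priority order; fall through to the next on failure.
async function routeWithFailover(
  providers: Provider[],
  prompt: string
): Promise<string> {
  let lastError: unknown;
  for (const provider of providers) {
    try {
      return await provider.call(prompt);
    } catch (err) {
      lastError = err; // provider down or rate-limited — try the next one
    }
  }
  throw new Error(`All providers failed; last error: ${String(lastError)}`);
}
```

The key design point is that callers see a single endpoint: provider outages and rate limits are absorbed by the loop instead of surfacing to the chat UI or workflows.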
```
┌─────────────────────────────────────────────────────────────────┐
│                        NEXUS ECOSYSTEM                          │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  ┌──────────────┐    ┌──────────────┐    ┌──────────────┐       │
│  │ 🚀 NEXUS     │    │ 🧠 OPEN      │    │ ⚙️ N8N       │       │
│  │   GATEWAY    │◄───┤   WEBUI      │◄───┤              │       │
│  │              │    │              │    │              │       │
│  │ • Multi-LLM  │    │ • Chat UI    │    │ • Webhooks   │       │
│  │ • Fallback   │    │ • RAG Docs   │    │ • Workflows  │       │
│  │ • Telemetry  │    │ • Functions  │    │ • Bots       │       │
│  └──────────────┘    └──────────────┘    └──────────────┘       │
│         │                                                       │
│         ▼                                                       │
│  ┌───────────────────────────────────────────────────────────┐  │
│  │ ☁️ LLM PROVIDERS: Groq • Gemini • Claude • OpenRouter     │  │
│  └───────────────────────────────────────────────────────────┘  │
└─────────────────────────────────────────────────────────────────┘
```
```
nexus-ecosystem/
├── deploy/
│   └── nexus-stack-docker-compose.yml   # Production Docker Compose
├── ECOSYSTEM_STATUS.md                  # Current deployment status
├── IMPLEMENTATION_PLAN_2026.md          # Detailed roadmap
├── JARVIS_FEATURES.md                   # Personal assistant features
├── PHASE_2_5_GUIDE.md                   # Configuration guide
├── PHASE_3_HOMELAB_PLAN.md              # Home lab setup plan
├── .env.example                         # Environment template
└── README.md                            # This file
```
- Docker & Docker Compose
- A VPS or local server (8GB+ RAM recommended)
- API keys for LLM providers (Groq, Gemini, etc.)
Clone this repository:

```bash
git clone https://github.com/Ramsesdb/nexus-ecosystem.git
cd nexus-ecosystem
```
Copy and configure environment variables:

```bash
cp .env.example .env
# Edit .env with your API keys
```
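As a rough illustration, the filled-in `.env` might look like the fragment below. The variable names here are placeholders, not the canonical ones: check `.env.example` in the repo for the actual keys it expects.

```bash
# Hypothetical provider keys — see .env.example for the real variable names
GROQ_API_KEY=gsk_...
GEMINI_API_KEY=...
ANTHROPIC_API_KEY=sk-ant-...
OPENROUTER_API_KEY=sk-or-...
```

Keep `.env` out of version control; only `.env.example` should be committed.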
Deploy with Docker Compose:

```bash
cd deploy
docker-compose -f nexus-stack-docker-compose.yml up -d
```
| Project | Description |
|---|---|
| Nexus AI Gateway | The multi-provider LLM proxy powering this ecosystem |
| Portfolio | My personal portfolio using this AI stack |
- Implementation Plan - Full roadmap and architecture details
- Phase 2.5 Guide - Post-deployment configuration
- Jarvis Features - Personal assistant capabilities
- Home Lab Plan - Future local deployment strategy
| Component | Technology |
|---|---|
| AI Gateway | Bun + TypeScript |
| Chat UI | Open WebUI |
| Automation | n8n |
| Orchestration | Docker Compose |
| Hosting | Azure VPS + Coolify |
MIT License - feel free to use this as inspiration for your own AI infrastructure!
Ramses Briceño - ramsesdb.tech
- GitHub: @Ramsesdb
- LinkedIn: Ramses Briceño