This document describes the backend implementation for Agent Nexus, which provides API endpoints and WebSocket connections for multi-agent AI collaboration.
- Next.js 16 with App Router
- Prisma ORM with SQLite database (easily upgradeable to PostgreSQL)
- Custom Node.js Server with WebSocket support
- Three AI Provider Integrations: Ollama (local), OpenAI, Google Gemini
- Real-time agent responses via WebSocket
- Multiple AI provider support (Ollama, OpenAI, Gemini)
- Database persistence for agents, conversations, and messages
- Streaming responses for better UX
- RESTful API endpoints
- Type-safe with TypeScript
Already done! Dependencies are installed via:

```bash
npm install
```

Copy `.env.example` to `.env.local` and add your API keys:
```bash
# Database
DATABASE_URL="file:./dev.db"

# OpenAI (optional - only needed if using OpenAI models)
OPENAI_API_KEY="sk-your-key-here"

# Google Gemini (optional - only needed if using Gemini models)
GOOGLE_API_KEY="your-gemini-key-here"

# Ollama (local - default configuration)
OLLAMA_BASE_URL="http://localhost:11434"
```

The database is already initialized! If you need to reset it:
```bash
DATABASE_URL="file:./dev.db" npx prisma migrate dev
DATABASE_URL="file:./dev.db" npx prisma db seed
```

Start the development server:

```bash
DATABASE_URL="file:./dev.db" npm run dev
```

The server will start on:
- HTTP: http://localhost:3001
- WebSocket: ws://localhost:3001
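Both endpoints share one port because the custom server attaches the WebSocket server to the HTTP server. On the client, the streamed `neural-event` messages (documented under WebSocket Events below) can be folded into one full response per conversation. The sketch below is hypothetical — the event shape is taken from that section, but the reducer itself is not part of the codebase:

```typescript
// Shape of messages received over the WebSocket (assumed from the
// WebSocket Events section of this document).
interface NeuralEvent {
  type: string;
  kind?: "agent-response" | "agent-response-complete";
  data?: {
    agentId?: string;
    chunk?: string;
    response?: string;
    conversationId?: string;
  };
}

// Fold streamed chunk events into a complete response per conversation.
function foldChunks(events: NeuralEvent[]): Map<string, string> {
  const byConversation = new Map<string, string>();
  for (const ev of events) {
    if (ev.type !== "neural-event" || !ev.data?.conversationId) continue;
    const id = ev.data.conversationId;
    if (ev.kind === "agent-response" && ev.data.chunk !== undefined) {
      // Append each streamed chunk as it arrives.
      byConversation.set(id, (byConversation.get(id) ?? "") + ev.data.chunk);
    } else if (ev.kind === "agent-response-complete" && ev.data.response !== undefined) {
      // The completion event carries the full text; prefer it over accumulated chunks.
      byConversation.set(id, ev.data.response);
    }
  }
  return byConversation;
}
```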
## API Endpoints

### GET /api/agents

List all active agents.
Response:

```json
[
  {
    "id": "uuid",
    "identityName": "Llama 3 Assistant",
    "identityEmoji": "🦙",
    "workspace": "local",
    "model": "llama3",
    "provider": "ollama",
    "description": "Fast local Llama 3 model via Ollama",
    "status": "active"
  }
]
```

### POST /api/agents

Create a new agent.
Request:

```json
{
  "name": "My Agent",
  "description": "Custom agent description",
  "provider": "ollama",
  "model": "llama3",
  "emoji": "🤖",
  "workspace": "default"
}
```

### POST /api/invoke

Invoke an agent with a message.
Request:

```json
{
  "agentId": "uuid",
  "message": "Hello, how can you help me?",
  "conversationId": "optional-uuid"
}
```

Response:

```json
{
  "success": true,
  "response": "Agent response text...",
  "conversationId": "uuid",
  "messageId": "uuid"
}
```

## WebSocket Events

Connect to `ws://localhost:3001` to receive real-time updates.
Connection event:

```json
{
  "type": "connection",
  "status": "connected",
  "timestamp": "2024-01-28T..."
}
```

Streaming chunk event:

```json
{
  "type": "neural-event",
  "kind": "agent-response",
  "data": {
    "agentId": "uuid",
    "agentName": "Agent Name",
    "chunk": "Response chunk...",
    "conversationId": "uuid"
  }
}
```

Completion event:

```json
{
  "type": "neural-event",
  "kind": "agent-response-complete",
  "data": {
    "agentId": "uuid",
    "agentName": "Agent Name",
    "response": "Complete response text",
    "conversationId": "uuid",
    "messageId": "uuid"
  }
}
```

## Provider Setup

### Ollama (Local)

- Install Ollama: https://ollama.ai
- Start the Ollama server:

  ```bash
  ollama serve
  ```

- Pull models:

  ```bash
  ollama pull llama3
  ollama pull codellama
  ```
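Ollama's generate endpoint streams newline-delimited JSON objects, each carrying a `response` chunk and a final object with `"done": true`. As a sketch of how such a stream could be joined into one string (the field names come from the public Ollama API; the helper itself is hypothetical):

```typescript
// Join an Ollama NDJSON stream into the full response text.
// Each line looks like {"response":"...","done":false}; the last has "done":true.
function joinOllamaStream(ndjson: string): string {
  return ndjson
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line) as { response?: string; done?: boolean })
    .map((obj) => obj.response ?? "")
    .join("");
}
```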
### OpenAI

- Get an API key from: https://platform.openai.com/api-keys
- Add it to `.env.local`:

  ```bash
  OPENAI_API_KEY="sk-your-key-here"
  ```
### Google Gemini

- Get an API key from: https://makersuite.google.com/app/apikey
- Add it to `.env.local`:

  ```bash
  GOOGLE_API_KEY="your-gemini-key-here"
  ```
## Database Schema

### Agent

- `id`: UUID primary key
- `name`: Agent display name
- `description`: Optional description
- `provider`: "ollama" | "openai" | "gemini"
- `model`: Model name (e.g., "llama3", "gpt-4")
- `emoji`: Display emoji
- `workspace`: Workspace identifier
- `status`: "active" | "paused" | "error"
- `config`: JSON string for provider-specific config
### Conversation

- `id`: UUID primary key
- `title`: Optional conversation title
### Message

- `id`: UUID primary key
- `conversationId`: Foreign key to Conversation
- `agentId`: Foreign key to Agent (nullable)
- `role`: "user" | "agent"
- `content`: Message text
- `type`: "message" | "thought" | "action"
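The schema itself lives in `prisma/schema.prisma`; as a rough illustration, the Agent fields above could be mirrored on the TypeScript side with a union type and a runtime guard (names here follow the field list, but this exact code is an assumption, not the project's `types.ts`):

```typescript
// Union types mirroring the enumerated fields in the Agent model above.
type Provider = "ollama" | "openai" | "gemini";
type AgentStatus = "active" | "paused" | "error";

interface AgentRecord {
  id: string;
  name: string;
  description?: string;
  provider: Provider;
  model: string;            // e.g., "llama3", "gpt-4"
  emoji: string;
  workspace: string;
  status: AgentStatus;
  config: string;           // JSON string for provider-specific config
}

// Runtime guard: narrows an unknown value (e.g., a request body field)
// to the Provider union before it reaches the database.
function isProvider(value: unknown): value is Provider {
  return value === "ollama" || value === "openai" || value === "gemini";
}
```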
Open Prisma Studio:

```bash
npx prisma studio
```

Reset the database:

```bash
DATABASE_URL="file:./dev.db" npx prisma migrate reset
```

Create a new migration:

```bash
DATABASE_URL="file:./dev.db" npx prisma migrate dev --name migration_name
```

## Project Structure

```
my-app/
├── app/
│   ├── api/
│   │   ├── agents/route.ts        # Agents CRUD
│   │   └── invoke/route.ts        # Agent invocation
│   └── [pages]                    # Frontend pages
├── lib/
│   ├── db/
│   │   └── prisma.ts              # Prisma client singleton
│   └── agents/
│       ├── types.ts               # TypeScript interfaces
│       ├── provider-factory.ts    # Provider factory
│       └── providers/
│           ├── ollama.ts          # Ollama integration
│           ├── openai.ts          # OpenAI integration
│           └── gemini.ts          # Gemini integration
├── prisma/
│   ├── schema.prisma              # Database schema
│   └── seed.ts                    # Database seeding
├── server.ts                      # Custom Next.js server with WebSocket
└── .env.local                     # Environment variables (not committed)
```
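`lib/agents/provider-factory.ts` maps a provider name to a concrete integration. The real implementation is not reproduced here; the sketch below only illustrates the factory pattern with stub providers, and the interface name and signature are assumptions:

```typescript
// Minimal provider abstraction: every integration exposes one invoke call.
interface AgentProvider {
  readonly name: string;
  invoke(model: string, prompt: string): Promise<string>;
}

// Factory: selects an integration by the "provider" field stored on an Agent.
// The stub invoke bodies stand in for the real Ollama/OpenAI/Gemini calls.
function createProvider(provider: string): AgentProvider {
  switch (provider) {
    case "ollama":
      return { name: "ollama", invoke: async (m, p) => `[ollama:${m}] ${p}` };
    case "openai":
      return { name: "openai", invoke: async (m, p) => `[openai:${m}] ${p}` };
    case "gemini":
      return { name: "gemini", invoke: async (m, p) => `[gemini:${m}] ${p}` };
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}
```

Dispatching on a stored string keeps the invoke route provider-agnostic: adding a new backend means one new case and one new file under `providers/`.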
## Testing

List agents:

```bash
curl http://localhost:3001/api/agents
```

Invoke an agent:

```bash
curl -X POST http://localhost:3001/api/invoke \
  -H "Content-Type: application/json" \
  -d '{
    "agentId": "your-agent-id",
    "message": "What is the meaning of life?"
  }'
```

Test the WebSocket connection:

```javascript
const ws = new WebSocket('ws://localhost:3001');

ws.onopen = () => {
  console.log('Connected');
};

ws.onmessage = (event) => {
  console.log('Received:', JSON.parse(event.data));
};
```

## Deployment

Build and start with a production database:

```bash
npm run build
DATABASE_URL="your-production-db-url" npm start
```

### Upgrading to PostgreSQL

- Update `prisma/schema.prisma`:

  ```prisma
  datasource db {
    provider = "postgresql"
    url      = env("DATABASE_URL")
  }
  ```

- Set the PostgreSQL connection string:

  ```bash
  DATABASE_URL="postgresql://user:password@host:5432/database"
  ```

- Run migrations:

  ```bash
  npx prisma migrate deploy
  ```
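Since `DATABASE_URL` is required while the provider keys are optional, a deploy can fail fast by checking the environment before starting. A small hypothetical helper (not part of `server.ts`) illustrates the idea:

```typescript
// Return the names of required environment variables that are unset or blank,
// so a production start can abort before the first request fails.
function missingEnv(
  env: Record<string, string | undefined>,
  required: string[],
): string[] {
  return required.filter((key) => {
    const value = env[key];
    return value === undefined || value.trim() === "";
  });
}

// Example usage: only DATABASE_URL is strictly required;
// OPENAI_API_KEY / GOOGLE_API_KEY matter only if those providers are used.
// const missing = missingEnv(process.env, ["DATABASE_URL"]);
// if (missing.length > 0) throw new Error(`Missing env vars: ${missing.join(", ")}`);
```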
## Troubleshooting

Port 3001 already in use:

```bash
lsof -i :3001
kill -9 <PID>
```

Regenerate the Prisma client:

```bash
DATABASE_URL="file:./dev.db" npx prisma generate
```

WebSocket not connecting:

- Ensure the custom server is running (not `next dev`)
- Check that `npm run dev` uses `tsx server.ts`
- Verify no firewall is blocking port 3001

Agent not responding:

- Ollama: Ensure `ollama serve` is running and the model is pulled
- OpenAI: Verify the API key is valid and has credits
- Gemini: Verify the API key is valid
For issues or questions, check:

- Frontend integration: `/app/meeting/page.tsx`
- Provider implementations: `/lib/agents/providers/`
- Database schema: `/prisma/schema.prisma`