Custom MCP Server Support (for Multi-Agent LLM Ecosystem) + NotebookLM Integration #403
maceatwork started this conversation in Ideas
I need BrowserOS to connect to my NotebookLM (Google's AI research assistant) to query my research notes and sources during browser automation tasks.
CURRENT SETUP:
- NotebookLM MCP server running locally via `npx -y notebooklm-mcp@latest`
- Working integrations with Antigravity IDE and OpenClaw Personnel Assistant
- SSE bridge built at http://localhost:3000/sse (ready for BrowserOS)
- Can add YouTube sources via a browser extension
DISCOVERY:
BrowserOS agents cannot add custom MCP servers. We checked:
- browser://settings - no accessible MCP configuration UI
- Filesystem - no mcp_config.json or similar files
- Agent tools - no API for adding external MCP endpoints
CURRENT WORKAROUND:
Built a cross-agent delegation protocol with Antigravity IDE:
- Shared workspace: /Users/????/.openclaw/shared-agent-workspace/
- JSON message protocol for agent-to-agent communication
- On-demand collaboration when I explicitly request it
FEATURE REQUESTS:
- Custom MCP server configuration (SSE endpoints)
- Agent-accessible MCP client tools
- Built-in NotebookLM integration
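One possible shape for that configuration, modeled on the `mcpServers` format other MCP clients already use. These field names are a proposal; the actual custom_mcp_config.example.json may differ.

```json
{
  "mcpServers": {
    "notebooklm": {
      "transport": "sse",
      "url": "http://localhost:3000/sse",
      "enabled": true
    }
  }
}
```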
PROOF OF DEMAND:
I already have three working NotebookLM integrations. The SSE bridge is running and tested. This would be a major competitive advantage over other AI tools.
FULL TECHNICAL REPORT:
Available in workspace: browseros-notebooklm-integration-report.md
Includes: Phase breakdown, architecture diagrams, protocol specs, config examples
AGENT-TO-AGENT COLLABORATION:
This is a novel workflow worth considering as a BrowserOS feature - enabling agents to delegate tasks to other specialized agents when they hit capability limits.
My Multi-Agent LLM Ecosystem
Current Architecture:
BrowserOS Agent → Uses a locally hosted Qwen3.5 27B (served from the ASUS Z13 Strix Halo via an LM Studio server; my main provider for BrowserOS on both the Mac Mini and the Z13) and a smaller Qwen3.5 9B (served from the Mac Mini via LM Studio; BrowserOS on the Z13 can also use it as the main provider for simpler tasks such as summarization or Markdown file creation). Both machines always keep two locally hosted models available to all tools at the same time, alongside the cloud LLMs.
Antigravity IDE Agent → Uses Gemini-3.1-Pro as the "Final Boss" controller (because of its huge 1M context) and for creative image/video generation with Remotion-CLI
OpenClaw Agent → Uses cloud LLMs (Kimi 2.5, Anthropic, Gemini variants) as a hub, with a shared folder for knowledge, tasks, skills, research data, and whatever else is needed
NotebookLM MCP → Running via SSE bridge at localhost:3000/sse
The Vision:
All agents collaborate through delegated tasks based on their strengths:
- BrowserOS handles browser automation + web research
- Antigravity (Gemini) coordinates complex reasoning and creative tasks with Remotion-CLI and monitors/controls all agents thanks to its huge 1M context; it gets help from a Roo-code agent running as an extension inside Antigravity, backed by the locally hosted Qwen3.5 (for now the best open-source LLM with vision)
- OpenClaw manages API integrations and external services
- NotebookLM provides persistent knowledge base access
Why Custom MCP Support Is Critical:
Without it, I have to manually bridge agents through file-based protocols. With native MCP support, BrowserOS could:
- Query my NotebookLM research directly during web browsing
- Delegate complex analysis to other agents automatically
- Share context seamlessly across my entire AI ecosystem
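Concretely, "query my NotebookLM research directly" boils down to the agent sending an MCP tool invocation over the bridge. The JSON-RPC 2.0 envelope below follows the MCP spec's `tools/call` shape; the tool name `query_notebook` and its arguments are made up for illustration.

```javascript
// Sketch of the JSON-RPC 2.0 request an MCP client sends to invoke a tool.
// "tools/call" with params {name, arguments} is the shape defined by the
// MCP spec; the tool name "query_notebook" is a hypothetical example.
function buildToolCall(id, toolName, args) {
  return {
    jsonrpc: '2.0',
    id,
    method: 'tools/call',
    params: { name: toolName, arguments: args },
  };
}

// Example: what a BrowserOS agent could POST to the bridge's message
// endpoint while researching in the browser.
const request = buildToolCall(1, 'query_notebook', {
  query: 'summaries of my saved research sources',
});
```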
Files I can provide to developers so they can implement this as a standard BrowserOS feature
I have detailed technical documentation ready:
- browseros-notebooklm-integration-report.md - full phase-by-phase breakdown
- custom_mcp_config.example.json - proposed configuration format
- agent-protocol.md - the cross-agent communication protocol we built
- notebooklm-sse.js - working SSE bridge source code
All files demonstrate this is technically feasible and user-tested.
Contact
X/Twitter: @macidoniadesign
Happy to collaborate, test beta features, or provide more technical details. This would position BrowserOS as the only browser automation tool with true multi-agent MCP support!