- Overview
- Why UnisonAI?
- Installation
- Core Components
- Configuration
- Parameter Reference Tables
- Usage Examples
- FAQ
- Contributing And License
UnisonAI is a flexible and extensible Python framework for building, coordinating, and scaling multiple AI agents, each powered by the LLM of your choice.
- Agent: For solo, focused tasks or as part of a clan for teamwork.
- Clan: For coordination and distributed problem-solving with multiple agents.
- Tool System: Easily augment agents with custom, pluggable tools (web search, time, APIs, your own logic).
Supports Cohere, Mixtral, Groq, Gemini, Grok, OpenAI, Anthropic, HelpingAI, and any custom model (just extend BaseLLM). UnisonAI is designed for real-world, production-grade multi-agent AI applications.
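Any other provider can be wired in by subclassing BaseLLM. The sketch below is only illustrative: the import path and the `run` method name are assumptions, not the documented BaseLLM interface, so adapt them to the base class in your installed version.

```python
from unisonai.llms import BaseLLM  # assumed import path; adjust to the real one


class MyProviderLLM(BaseLLM):
    """Illustrative custom model wrapper; method names here are assumptions."""

    def __init__(self, api_key: str, model: str = "my-model-v1"):
        self.api_key = api_key
        self.model = model

    def run(self, prompt: str) -> str:
        # Replace this stub with a call to your provider's SDK or HTTP API.
        # `run` is a placeholder name; implement whichever method BaseLLM requires.
        return f"[{self.model}] response to: {prompt}"
```
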
```bash
pip install unisonai
```

```python
from unisonai import Agent
from unisonai.llms import Gemini
from unisonai import config
config.set_api_key("gemini", "your-api-key")
agent = Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Assistant",
    description="A helpful AI assistant"
)
print(agent.unleash(task="Explain quantum computing"))
```

UnisonAI stands out with its unique Agent-to-Agent (A2A) communication architecture, enabling seamless coordination between AI agents as if they were human team members collaborating on complex tasks.

- Strong Type Validation: All tool parameters are validated against the `ToolParameterType` enum before execution
- Enhanced Error Handling: Comprehensive error catching with detailed metadata for debugging
- Standardized Results: All tools return `ToolResult` objects with a success status and metadata (see the sketch below)
- Complex Research Tasks: Multiple agents gathering, analyzing, and synthesizing information
- Workflow Automation: Coordinated agents handling multi-step business processes
- Content Creation: Specialized agents for research, writing, editing, and publishing
- Data Analysis: Distributed agents processing large datasets with different expertise
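Conceptually, the standardized-result pattern noted in the feature list above works like the plain-Python sketch below. `IllustrativeToolResult` and `run_tool_body` are illustrative names only, not UnisonAI's actual `ToolResult` API.

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class IllustrativeToolResult:
    """Stand-in for the success-plus-metadata idea; not UnisonAI's real class."""
    success: bool
    output: Any = None
    metadata: dict = field(default_factory=dict)


def run_tool_body(fn, **kwargs) -> IllustrativeToolResult:
    # Wrap a tool body so callers always get a success flag plus debug metadata,
    # mirroring the error-handling and standardized-results points above.
    try:
        return IllustrativeToolResult(success=True, output=fn(**kwargs))
    except Exception as exc:
        return IllustrativeToolResult(success=False, metadata={"error": str(exc)})
```
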
| Component | Purpose | Key Features |
|---|---|---|
| Agent | Standalone or clan member agent | Own history, tool integration, configurable LLMs, inter-agent messaging |
| Clan | Multi-agent orchestration | Team management, shared goals, coordinated execution |
| Tool System | Extensible capability framework | Type validation, error handling, standardized results |

```python
from unisonai import Agent
from unisonai.llms import Gemini
from unisonai.tools.memory import MemoryTool
agent = Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Research Assistant",
    description="An AI assistant with memory capabilities",
    tools=[MemoryTool]
)
agent.unleash(task="Store important project details")
```

```python
from unisonai import Agent, Clan
from unisonai.llms import Gemini

research_agent = Agent(llm=Gemini(), identity="Researcher", task="Gather information")
analysis_agent = Agent(llm=Gemini(), identity="Analyst", task="Analyze findings")
clan = Clan(
    clan_name="Research Team",
    manager=research_agent,
    members=[research_agent, analysis_agent],
    goal="Comprehensive market analysis"
)
clan.unleash()
```

```python
from unisonai.tools.tool import BaseTool, Field
from unisonai.tools.types import ToolParameterType
class CalculatorTool(BaseTool):
    def __init__(self):
        self.name = "calculator"
        self.description = "Mathematical operations"
        self.params = [
            Field(name="operation", field_type=ToolParameterType.STRING, required=True),
            Field(name="a", field_type=ToolParameterType.FLOAT, required=True),
            Field(name="b", field_type=ToolParameterType.FLOAT, required=True)
        ]
        super().__init__()

    def _run(self, operation: str, a: float, b: float) -> float:
        # This example adds for "add" and multiplies for any other operation.
        return a + b if operation == "add" else a * b
```

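A custom tool is then handed to an agent just like the MemoryTool example above. The snippet below assumes the class itself goes in the tools list, mirroring that example; confirm whether your version expects the class or an instance.

```python
from unisonai import Agent
from unisonai.llms import Gemini

# Passing the tool class mirrors the MemoryTool example above.
agent = Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Math Helper",
    description="An assistant that can do arithmetic with the calculator tool",
    tools=[CalculatorTool]
)
print(agent.unleash(task="What is 7 plus 5?"))
```
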
```python
from unisonai import config

# Method 1: Configuration system
config.set_api_key("gemini", "your-key")
config.set_api_key("openai", "your-key")
```

```bash
# Method 2: Environment variables
export GEMINI_API_KEY="your-key"
export OPENAI_API_KEY="your-key"
```

```python
# Method 3: Direct LLM initialization
from unisonai.llms import Gemini

llm = Gemini(api_key="your-key")
```

- Quick Start Guide - 5-minute setup guide
- Installation - Detailed installation options
- API Reference - Complete API documentation
- Architecture Guide - System design and patterns
- Usage Guidelines - Best practices and patterns
- Tool System Guide - Custom tool creation and validation
- Parameter Reference - Complete parameter documentation
- Basic Examples - Simple agent patterns
- Advanced Examples - Multi-agent coordination
- Tool Examples - Custom tool implementations
What is UnisonAI?
A Python framework for building and orchestrating AI agents with A2A communication.

When should I use a Clan?
For complex, multi-step tasks requiring specialized agents working together.

Can I add custom LLMs?
Yes! Extend the `BaseLLM` class to integrate any model provider.

What are tools?
Reusable components that extend agent capabilities (web search, APIs, custom logic).

How do I manage API keys?
Use the config system, environment variables, or pass keys directly to LLMs.

Founder: Anant Sharma ([E5Anant](https://github.com/E5Anant))
PRs and issues welcome! See our Contributing Guide.
Open Issues • Submit PRs • Suggest Features

