
UnisonAI Banner


UnisonAI

Orchestrate the Future of Multi-Agent AI



Overview

UnisonAI is a flexible and extensible Python framework for building, coordinating, and scaling multiple AI agents, each powered by the LLM of your choice.

  • Agent: For solo, focused tasks or as part of a clan for teamwork.
  • Clan: For coordination and distributed problem-solving with multiple agents.
  • Tool System: Easily augment agents with custom, pluggable tools (web search, time, APIs, your own logic).

Supports Cohere, Mixtral, Groq, Gemini, Grok, OpenAI, Anthropic, HelpingAI, and any custom model (just extend BaseLLM). UnisonAI is designed for real-world, production-grade multi-agent AI applications.
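
Wiring in a provider that is not bundled means subclassing BaseLLM. The sketch below is illustrative only: the import path, the run method name, and the constructor arguments are assumptions rather than UnisonAI's confirmed interface, so check the BaseLLM source for the real abstract methods before copying it.

from unisonai.llms import BaseLLM  # assumed import path for the base class
import requests

class MyHostedLLM(BaseLLM):
    """Hypothetical wrapper around a self-hosted text-generation endpoint."""

    def __init__(self, endpoint: str, api_key: str):
        super().__init__()  # BaseLLM may require its own arguments; adjust as needed
        self.endpoint = endpoint
        self.api_key = api_key

    def run(self, prompt: str) -> str:
        # Send the prompt to the hosted model and return the plain-text reply.
        resp = requests.post(
            self.endpoint,
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"prompt": prompt},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["text"]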


Quick Start

pip install unisonai

from unisonai import Agent
from unisonai.llms import Gemini
from unisonai import config

config.set_api_key("gemini", "your-api-key")
agent = Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Assistant",
    description="A helpful AI assistant"
)
print(agent.unleash(task="Explain quantum computing"))

What Makes UnisonAI Special

UnisonAI stands out with its unique Agent-to-Agent (A2A) communication architecture, enabling seamless coordination between AI agents as if they were human team members collaborating on complex tasks.

[Diagram: A2A Communication Architecture]

[Diagram: Example]

Latest Enhancements

  • 🔒 Strong Type Validation: All tool parameters are validated against the ToolParameterType enum before execution
  • 🛡️ Enhanced Error Handling: Comprehensive error catching with detailed metadata for debugging
  • 📊 Standardized Results: All tools return ToolResult objects with success status and metadata (see the sketch below)
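
As a rough illustration of the result shape these bullets imply (the run call, the placeholder calculator_tool instance, and the success, result, and metadata attributes are all assumptions drawn from the description above, not a verified UnisonAI API):

# Hypothetical consumption of a ToolResult; every name here is assumed, not confirmed.
result = calculator_tool.run(operation="add", a=2, b=3)  # placeholder tool instance and method

if result.success:
    print("Tool output:", result.result)    # assumed payload attribute
else:
    print("Tool failed:", result.metadata)  # assumed debugging metadata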

Perfect For:

  • Complex Research Tasks: Multiple agents gathering, analyzing, and synthesizing information
  • Workflow Automation: Coordinated agents handling multi-step business processes
  • Content Creation: Specialized agents for research, writing, editing, and publishing
  • Data Analysis: Distributed agents processing large datasets with different expertise

Core Components

  • Agent: a standalone or clan-member agent with its own history, tool integration, configurable LLMs, and inter-agent messaging.
  • Clan: multi-agent orchestration with team management, shared goals, and coordinated execution.
  • Tool System: an extensible capability framework with type validation, error handling, and standardized results.

Usage Examples

Individual Agent

from unisonai import Agent
from unisonai.llms import Gemini
from unisonai.tools.memory import MemoryTool

agent = Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Research Assistant",
    description="An AI assistant with memory capabilities",
    tools=[MemoryTool]
)
agent.unleash(task="Store important project details")
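
To close the loop, the same agent can then be asked to use what it stored (the prompt below is just an illustrative follow-up call):

print(agent.unleash(task="Recall the project details you stored earlier"))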

Multi-Agent Clan

from unisonai import Agent, Clan
from unisonai.llms import Gemini

research_agent = Agent(llm=Gemini(), identity="Researcher", task="Gather information")
analysis_agent = Agent(llm=Gemini(), identity="Analyst", task="Analyze findings")

clan = Clan(
    clan_name="Research Team",
    manager=research_agent,
    members=[research_agent, analysis_agent],
    goal="Comprehensive market analysis"
)
clan.unleash()

Custom Tools

from unisonai.tools.tool import BaseTool, Field
from unisonai.tools.types import ToolParameterType

class CalculatorTool(BaseTool):
    def __init__(self):
        self.name = "calculator"
        self.description = "Mathematical operations"
        self.params = [
            Field(name="operation", field_type=ToolParameterType.STRING, required=True),
            Field(name="a", field_type=ToolParameterType.FLOAT, required=True),
            Field(name="b", field_type=ToolParameterType.FLOAT, required=True)
        ]
        super().__init__()

    def _run(self, operation: str, a: float, b: float) -> float:
        # Minimal demo logic: "add" returns a + b; any other operation multiplies.
        return a + b if operation == "add" else a * b
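
A custom tool is attached the same way as the built-in MemoryTool above, by passing the class in the tools list; the identity, description, and task below are only illustrative:

from unisonai import Agent
from unisonai.llms import Gemini

# Reuse the Agent constructor from the earlier examples and register CalculatorTool.
calc_agent = Agent(
    llm=Gemini(model="gemini-2.0-flash"),
    identity="Math Helper",
    description="An assistant that performs arithmetic via the calculator tool",
    tools=[CalculatorTool]
)
calc_agent.unleash(task="Add 2 and 3 using the calculator tool")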

Configuration

API Keys

from unisonai import config
from unisonai.llms import Gemini

# Method 1: Configuration system
config.set_api_key("gemini", "your-key")
config.set_api_key("openai", "your-key")

# Method 2: Environment variables (set these in your shell, not in Python)
#   export GEMINI_API_KEY="your-key"
#   export OPENAI_API_KEY="your-key"

# Method 3: Direct LLM initialization
llm = Gemini(api_key="your-key")

Documentation Hub

🚀 Getting Started

📖 Core Documentation

🛠️ Advanced Features

💡 Examples & Tutorials


FAQ

What is UnisonAI? A Python framework for building and orchestrating AI agents with A2A communication.
When should I use a Clan? For complex, multi-step tasks that require specialized agents working together.
Can I add custom LLMs? Yes! Extend the BaseLLM class to integrate any model provider.
What are tools? Reusable components that extend agent capabilities (web search, APIs, custom logic).
How do I manage API keys? Use the config system, environment variables, or pass keys directly to LLMs.

Contributing

Founder: Anant Sharma ([E5Anant](https://github.com/E5Anant))

PRs and issues welcome! See our Contributing Guide.

Open Issues • Submit PRs • Suggest Features

