Python: Adding Crew.AI as a plugin. (#10474)
### Motivation and Context

Adding Crew.AI as a plugin to the Semantic Kernel will allow users to
leverage the capabilities of Crew.AI in their applications. This plugin works
with Crews that have been deployed to the Crew.AI Enterprise service.

### Description


The plugin functionality has been manually verified with the concept
samples. Other tests will come in the next update.

### Contribution Checklist

- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄

---------

Co-authored-by: Ben Thomas <[email protected]>
Co-authored-by: Eduard van Valkenburg <[email protected]>
3 people authored Feb 13, 2025
1 parent 88fbaf8 commit 892c206
Showing 10 changed files with 847 additions and 105 deletions.
4 changes: 3 additions & 1 deletion python/.env.example
@@ -34,4 +34,6 @@ BOOKING_SAMPLE_CLIENT_ID=""
BOOKING_SAMPLE_TENANT_ID=""
BOOKING_SAMPLE_CLIENT_SECRET=""
BOOKING_SAMPLE_BUSINESS_ID=""
BOOKING_SAMPLE_SERVICE_ID=""
CREW_AI_ENDPOINT=""
CREW_AI_TOKEN=""
47 changes: 47 additions & 0 deletions python/samples/concepts/plugins/crew_ai/README.md
@@ -0,0 +1,47 @@
# Crew AI Plugin for Semantic Kernel

This sample demonstrates how to integrate with [Crew AI Enterprise](https://app.crewai.com/) crews in Semantic Kernel.

## Requirements

Before running this sample, you need to have a Crew deployed to the Crew AI Enterprise cloud. Many pre-built Crew templates can be found [here](https://app.crewai.com/crewai_plus/templates). You will need the following information from your deployed Crew:

- endpoint: The base URL for your Crew.
- authentication token: The authentication token for your Crew.
- required inputs: Most Crews have a set of required inputs that must be provided when kicking off the Crew, so their names, types, and values need to be known.
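For example, required inputs are passed as a plain dict keyed by input name. The sketch below uses the input names from the `Enterprise Content Marketing Crew` template that this sample is based on; the values are placeholders, and your own Crew's inputs may differ:

```python
# Placeholder values illustrating the inputs contract for the
# Enterprise Content Marketing Crew template used in this sample.
# Substitute your own Crew's required input names and values.
inputs = {
    "company": "Contoso",  # hypothetical company name
    "topic": "Agentic products for consumers",
}
```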

## Using the Crew Plugin

Once configured, the `CrewAIEnterprise` class can be used directly by calling methods on it, or can be used to generate a Semantic Kernel plugin with inputs that match those of your Crew. Generating a plugin is useful for scenarios when you want an LLM to be able to invoke your Crew as a tool.

## Running the sample

1. Deploy your Crew to the Crew Enterprise cloud.
1. Gather the required information listed above.
1. Create environment variables or use your .env file to define your Crew's endpoint and token as:

```
CREW_AI_ENDPOINT="{Your Crew's endpoint}"
CREW_AI_TOKEN="{Your Crew's authentication token}"
```

1. In [crew_ai_plugin.py](./crew_ai_plugin.py), find the section that defines the Crew's required inputs and modify it to match your Crew's inputs. The input descriptions and types are critical to helping the LLM understand each input's semantic meaning so that it can accurately call the plugin. The sample is based on the `Enterprise Content Marketing Crew` template, which has two required inputs, `company` and `topic`.

```python
# The required inputs for the Crew must be known in advance. This example is modeled after the
# Enterprise Content Marketing Crew Template and requires string inputs for the company and topic.
# We need to describe the type and purpose of each input to allow the LLM to invoke the crew as expected.
crew_plugin_definitions = [
    KernelParameterMetadata(
        name="company",
        type="string",
        description="The name of the company that should be researched",
        is_required=True,
    ),
    KernelParameterMetadata(
        name="topic", type="string", description="The topic that should be researched", is_required=True
    ),
]
```

1. Run the sample. Notice that the sample invokes (kicks off) the Crew twice: once directly by calling the `kickoff` method, and once by creating a plugin and invoking it.
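As a minimal sketch of step 3 above (illustration only: the endpoint URL and token below are placeholders, and the `CrewAISettings` class added in this PR presumably reads these variables through its own settings mechanism rather than plain `os.environ`):

```python
import os

# Set placeholder values, then read them back the way application code might.
# In practice these come from your deployed Crew and would be defined in the
# environment or a .env file, not hard-coded.
os.environ["CREW_AI_ENDPOINT"] = "https://example.crewai.com/crew"  # placeholder URL
os.environ["CREW_AI_TOKEN"] = "placeholder-token"  # placeholder token

endpoint = os.environ["CREW_AI_ENDPOINT"]
token = os.environ["CREW_AI_TOKEN"]
```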
140 changes: 140 additions & 0 deletions python/samples/concepts/plugins/crew_ai/crew_ai_plugin.py
@@ -0,0 +1,140 @@
# Copyright (c) Microsoft. All rights reserved.

import asyncio
import logging

from samples.concepts.setup.chat_completion_services import Services, get_chat_completion_service_and_request_settings
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.chat_completion_client_base import ChatCompletionClientBase
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.prompt_execution_settings import PromptExecutionSettings
from semantic_kernel.contents.chat_history import ChatHistory
from semantic_kernel.core_plugins.crew_ai import CrewAIEnterprise
from semantic_kernel.functions.kernel_parameter_metadata import KernelParameterMetadata

logging.basicConfig(level=logging.INFO)


async def using_crew_ai_enterprise():
    # Create an instance of the CrewAI Enterprise Crew
    async with CrewAIEnterprise() as crew:
        #####################################################################
        # Using the CrewAI Enterprise Crew directly                         #
        #####################################################################

        # The required inputs for the Crew must be known in advance. This example is modeled after the
        # Enterprise Content Marketing Crew Template and requires the following inputs:
        inputs = {"company": "CrewAI", "topic": "Agentic products for consumers"}

        # Invoke directly with our inputs
        kickoff_id = await crew.kickoff(inputs)
        print(f"CrewAI Enterprise Crew kicked off with ID: {kickoff_id}")

        # Wait for completion
        result = await crew.wait_for_crew_completion(kickoff_id)
        print("CrewAI Enterprise Crew completed with the following result:")
        print(result)

        #####################################################################
        # Using the CrewAI Enterprise as a Plugin                           #
        #####################################################################

        # Define the description of the Crew. This will be used as the semantic description of the plugin.
        crew_description = (
            "Conducts thorough research on the specified company and topic to identify emerging trends,"
            " analyze competitor strategies, and gather data-driven insights."
        )

        # The required inputs for the Crew must be known in advance. This example is modeled after the
        # Enterprise Content Marketing Crew Template and requires string inputs for the company and topic.
        # We need to describe the type and purpose of each input to allow the LLM to invoke the crew as expected.
        crew_input_parameters = [
            KernelParameterMetadata(
                name="company",
                type="string",
                type_object=str,
                description="The name of the company that should be researched",
                is_required=True,
            ),
            KernelParameterMetadata(
                name="topic",
                type="string",
                type_object=str,
                description="The topic that should be researched",
                is_required=True,
            ),
        ]

        # Create the CrewAI Plugin. This builds a plugin that can be added to the Kernel and invoked like any other
        # plugin. The plugin will contain the following functions:
        # - kickoff: Starts the Crew with the specified inputs and returns the Id of the scheduled kickoff.
        # - kickoff_and_wait: Starts the Crew with the specified inputs and waits for the Crew to complete before
        #   returning the result.
        # - wait_for_completion: Waits for the specified Crew kickoff to complete and returns the result.
        # - get_status: Gets the status of the specified Crew kickoff.
        crew_plugin = crew.create_kernel_plugin(
            name="EnterpriseContentMarketingCrew",
            description=crew_description,
            parameters=crew_input_parameters,
        )

        # Configure the kernel for chat completion and add the CrewAI plugin.
        kernel, chat_completion, settings = configure_kernel_for_chat()
        kernel.add_plugin(crew_plugin)

        # Create a chat history to store the system message, initial messages, and the conversation.
        history = ChatHistory()
        history.add_system_message("You are an AI assistant that can help me with research.")
        history.add_user_message(
            "I'm looking for emerging marketplace trends about Crew AI and their consumer AI products."
        )

        # Invoke the chat completion service with enough information for the CrewAI plugin to be invoked.
        response = await chat_completion.get_chat_message_content(history, settings, kernel=kernel)
        print(response)

        # expected output:
        # INFO:semantic_kernel.connectors.ai.open_ai.services.open_ai_handler:OpenAI usage: ...
        # INFO:semantic_kernel.connectors.ai.chat_completion_client_base:processing 1 tool calls in parallel.
        # INFO:semantic_kernel.kernel:Calling EnterpriseContentMarketingCrew-kickoff_and_wait function with args:
        # {"company":"Crew AI","topic":"emerging marketplace trends in consumer AI products"}
        # INFO:semantic_kernel.functions.kernel_function:Function EnterpriseContentMarketingCrew-kickoff_and_wait
        # invoking.
        # INFO:semantic_kernel.core_plugins.crew_ai.crew_ai_enterprise:CrewAI Crew kicked off with Id: *****
        # INFO:semantic_kernel.core_plugins.crew_ai.crew_ai_enterprise:CrewAI Crew with kickoff Id: ***** completed with
        # status: SUCCESS
        # INFO:semantic_kernel.functions.kernel_function:Function EnterpriseContentMarketingCrew-kickoff_and_wait
        # succeeded.
        # Here are some emerging marketplace trends related to Crew AI and their consumer AI products, along with
        # suggested content pieces to explore these trends: ...

def configure_kernel_for_chat() -> tuple[Kernel, ChatCompletionClientBase, PromptExecutionSettings]:
    kernel = Kernel()

    # You can select from the following chat completion services that support function calling:
    # - Services.OPENAI
    # - Services.AZURE_OPENAI
    # - Services.AZURE_AI_INFERENCE
    # - Services.ANTHROPIC
    # - Services.BEDROCK
    # - Services.GOOGLE_AI
    # - Services.MISTRAL_AI
    # - Services.OLLAMA
    # - Services.ONNX
    # - Services.VERTEX_AI
    # - Services.DEEPSEEK
    # Please make sure you have configured your environment correctly for the selected chat completion service.
    chat_completion_service, request_settings = get_chat_completion_service_and_request_settings(Services.OPENAI)

    # Configure the function choice behavior. Here, we set it to Auto, where auto_invoke=True by default.
    # With `auto_invoke=True`, the model will automatically choose and call functions as needed.
    request_settings.function_choice_behavior = FunctionChoiceBehavior.Auto()

    # Register the chat completion service with the kernel.
    kernel.add_service(chat_completion_service)
    return kernel, chat_completion_service, request_settings


if __name__ == "__main__":
    asyncio.run(using_crew_ai_enterprise())
11 changes: 11 additions & 0 deletions python/semantic_kernel/core_plugins/crew_ai/__init__.py
@@ -0,0 +1,11 @@
# Copyright (c) Microsoft. All rights reserved.

from semantic_kernel.core_plugins.crew_ai.crew_ai_enterprise import CrewAIEnterprise
from semantic_kernel.core_plugins.crew_ai.crew_ai_models import (
CrewAIStatusResponse,
)
from semantic_kernel.core_plugins.crew_ai.crew_ai_settings import (
CrewAISettings,
)

__all__ = ["CrewAIEnterprise", "CrewAISettings", "CrewAIStatusResponse"]