Add anthropic function calling with streaming example #131

Merged 2 commits on Feb 25, 2025
1 change: 1 addition & 0 deletions anthropic-functions-streaming/.env.example
@@ -0,0 +1 @@
ANTHROPIC_API_KEY=your_api_key_here
54 changes: 54 additions & 0 deletions anthropic-functions-streaming/README.md
@@ -0,0 +1,54 @@
# Chainlit Anthropic Example with Function Calling

This project demonstrates how to create a chatbot using Chainlit and Anthropic's Claude AI model, showcasing function calling capabilities with example tools.

## Overview

- Integrates Chainlit with the Claude 3.5 Sonnet model
- Demonstrates function calling with two example tools:
- Mock weather lookup
- Simple calculator

## Quick Start

1. Install dependencies:
```
pip install -r requirements.txt
```

2. Set your Anthropic API key in a `.env` file, following `.env.example`:
```
ANTHROPIC_API_KEY=your_api_key_here
```

3. Run the app:
```
chainlit run app.py
```

## Key Components

1. **Example Tools**:
- `get_current_weather`: Returns mock weather data
- `calculator`: Performs basic arithmetic

2. **Chainlit Setup**:
- `@cl.on_chat_start`: Initializes chat session
- `@cl.on_message`: Handles user messages

3. **Claude Integration**:
- `call_claude`: Manages communication with Claude

4. **Function Calling**:
- `@cl.step(type="tool")`: Handles tool execution
- `call_tool`: Routes to appropriate tool function

## Customization

- Add new tools to the `tools` list
- Implement corresponding functions in `TOOL_FUNCTIONS`
- Modify `SYSTEM` prompt or `MODEL_NAME` as needed
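
As one sketch of those three steps, a hypothetical `get_time` tool (the name, schema, and behavior below are illustrative, not part of this PR) could be wired up like this:

```python
import json
from datetime import datetime, timezone


# Hypothetical new tool: returns the current UTC time as a JSON string,
# matching the convention in app.py that every tool returns JSON.
async def get_time(fmt="%Y-%m-%d %H:%M:%S"):
    """Return the current UTC time, formatted with fmt."""
    return json.dumps({"utc_time": datetime.now(timezone.utc).strftime(fmt)})


# 1. Describe the tool so Claude knows when and how to call it.
get_time_schema = {
    "name": "get_time",
    "description": "Get the current UTC time.",
    "input_schema": {
        "type": "object",
        "properties": {
            "fmt": {
                "type": "string",
                "description": "strftime format string, e.g. '%H:%M'",
            }
        },
        "required": [],
    },
}

# 2. In app.py you would then register both halves:
# tools.append(get_time_schema)
# TOOL_FUNCTIONS["get_time"] = get_time
```

Because `call_tool` dispatches through `TOOL_FUNCTIONS.get(...)` and unpacks the model-supplied arguments with `**tool_input`, no other code needs to change.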

## Note

This is a demonstration project. The weather tool uses mock data, and the calculator is basic. In a production environment, replace these with real APIs and more robust implementations.
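
As a hedged sketch of that replacement (the endpoint URL and response shape below are placeholders, not a real weather service), the mock tool could be swapped for an HTTP call using the `httpx` client already pinned in `requirements.txt`. The `fetch` parameter is an illustrative injection point so the tool can be exercised without network access:

```python
import json

# Placeholder endpoint for illustration only -- not a real service.
WEATHER_API_URL = "https://api.example.com/weather"


async def get_current_weather(location, unit=None, fetch=None):
    """Fetch live weather data; `fetch` is injectable for testing."""
    if fetch is None:
        import httpx

        async def fetch(params):
            # Real implementation: one GET request, JSON response assumed.
            async with httpx.AsyncClient() as http:
                resp = await http.get(WEATHER_API_URL, params=params)
                resp.raise_for_status()
                return resp.json()

    data = await fetch({"location": location, "unit": unit or "fahrenheit"})
    # Tools in app.py return JSON strings, so keep that contract.
    return json.dumps(data)
```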
181 changes: 181 additions & 0 deletions anthropic-functions-streaming/app.py
@@ -0,0 +1,181 @@
import chainlit as cl
import json
from anthropic import AsyncAnthropic

SYSTEM = "You are a helpful assistant."
MODEL_NAME = "claude-3-5-sonnet-20240620"
c = AsyncAnthropic()

# hard-coded weather tool
async def get_current_weather(location, unit=None):
    """Get the current weather in a given location"""
    unit = unit or "Fahrenheit"
    weather_info = {
        "location": location,
        "temperature": "74",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }

    return json.dumps(weather_info)

# simple calculator tool
async def calculator(operation, operand1, operand2):
    """Perform a basic arithmetic operation"""
    result = None

    if operation == "add":
        result = operand1 + operand2
    elif operation == "subtract":
        result = operand1 - operand2
    elif operation == "multiply":
        result = operand1 * operand2
    elif operation == "divide":
        if operand2 != 0:
            result = operand1 / operand2
        else:
            return json.dumps({"error": "Division by zero is not allowed"})
    else:
        return json.dumps({"error": "Invalid operation"})

    calculation_info = {
        "operation": operation,
        "operand1": operand1,
        "operand2": operand2,
        "result": result,
    }

    return json.dumps(calculation_info)

# tool descriptions
tools = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a specified location.",
        "input_schema": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. 'San Francisco, CA'",
                },
                "unit": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "The unit of temperature to use (celsius or fahrenheit)",
                },
            },
            "required": ["location"],
        },
    },
    {
        "name": "calculator",
        "description": "A simple calculator that performs basic arithmetic operations.",
        "input_schema": {
            "type": "object",
            "properties": {
                "operation": {
                    "type": "string",
                    "enum": ["add", "subtract", "multiply", "divide"],
                    "description": "The arithmetic operation to perform.",
                },
                "operand1": {
                    "type": "number",
                    "description": "The first operand.",
                },
                "operand2": {
                    "type": "number",
                    "description": "The second operand.",
                },
            },
            "required": ["operation", "operand1", "operand2"],
        },
    },
]

# tool mappings
TOOL_FUNCTIONS = {
    "get_current_weather": get_current_weather,
    "calculator": calculator,
}

# send chat messages to the claude api, streaming the response into the ui
async def call_claude(chat_messages):
    msg = cl.Message(content="", author="Claude")

    async with c.messages.stream(
        max_tokens=1024,
        system=SYSTEM,
        messages=chat_messages,
        tools=tools,
        model=MODEL_NAME,
    ) as stream:
        async for text in stream.text_stream:
            await msg.stream_token(text)

        await msg.send()
        response = await stream.get_final_message()

    return response

# initialise chat
@cl.on_chat_start
async def start_chat():
    cl.user_session.set("chat_messages", [])

# route to functions based on tool call
@cl.step(type="tool")
async def call_tool(tool_use):
    tool_name = tool_use.name
    tool_input = tool_use.input

    current_step = cl.context.current_step
    current_step.name = tool_name

    tool_function = TOOL_FUNCTIONS.get(tool_name)

    if tool_function:
        try:
            current_step.output = await tool_function(**tool_input)
        except TypeError:
            current_step.output = json.dumps({"error": f"Invalid input for {tool_name}"})
    else:
        current_step.output = json.dumps({"error": f"Invalid tool: {tool_name}"})

    current_step.language = "json"
    return current_step.output

# main chat
@cl.on_message
async def chat(message: cl.Message):
    chat_messages = cl.user_session.get("chat_messages")
    chat_messages.append({"role": "user", "content": message.content})
    response = await call_claude(chat_messages)

    # keep calling tools until claude stops asking for them
    while response.stop_reason == "tool_use":
        tool_use = next(block for block in response.content if block.type == "tool_use")
        tool_result = await call_tool(tool_use)

        messages = [
            {"role": "assistant", "content": response.content},
            {
                "role": "user",
                "content": [
                    {
                        "type": "tool_result",
                        "tool_use_id": tool_use.id,
                        "content": str(tool_result),
                    }
                ],
            },
        ]

        chat_messages.extend(messages)
        response = await call_claude(chat_messages)

    final_response = next(
        (block.text for block in response.content if hasattr(block, "text")),
        None,
    )

    chat_messages.append({"role": "assistant", "content": final_response})
14 changes: 14 additions & 0 deletions anthropic-functions-streaming/chainlit.md
@@ -0,0 +1,14 @@
# Welcome to Chainlit! 🚀🤖

Hi there, Developer! 👋 We're excited to have you on board. Chainlit is a powerful tool designed to help you prototype, debug and share applications built on top of LLMs.

## Useful Links 🔗

- **Documentation:** Get started with our comprehensive [Chainlit Documentation](https://docs.chainlit.io) 📚
- **Discord Community:** Join our friendly [Chainlit Discord](https://discord.gg/k73SQ3FyUh) to ask questions, share your projects, and connect with other developers! 💬

We can't wait to see what you create with Chainlit! Happy coding! 💻😊

## Welcome screen

To modify the welcome screen, edit the `chainlit.md` file at the root of your project. If you do not want a welcome screen, just leave this file empty.
69 changes: 69 additions & 0 deletions anthropic-functions-streaming/requirements.txt
@@ -0,0 +1,69 @@
aiofiles==23.2.1
annotated-types==0.7.0
anthropic==0.30.1
anyio==3.7.1
asyncer==0.0.2
bidict==0.23.1
certifi==2024.6.2
chainlit==1.1.304
charset-normalizer==3.3.2
chevron==0.14.0
click==8.1.7
dataclasses-json==0.5.14
Deprecated==1.2.14
distro==1.9.0
fastapi==0.110.3
filelock==3.15.4
filetype==1.2.0
fsspec==2024.6.0
googleapis-common-protos==1.63.1
grpcio==1.64.1
h11==0.14.0
httpcore==1.0.5
httpx==0.27.0
huggingface-hub==0.23.4
idna==3.7
importlib_metadata==7.1.0
jiter==0.4.2
Lazify==0.4.0
literalai==0.0.604
marshmallow==3.21.3
mypy-extensions==1.0.0
nest-asyncio==1.6.0
numpy==1.26.4
opentelemetry-api==1.25.0
opentelemetry-exporter-otlp==1.25.0
opentelemetry-exporter-otlp-proto-common==1.25.0
opentelemetry-exporter-otlp-proto-grpc==1.25.0
opentelemetry-exporter-otlp-proto-http==1.25.0
opentelemetry-instrumentation==0.46b0
opentelemetry-proto==1.25.0
opentelemetry-sdk==1.25.0
opentelemetry-semantic-conventions==0.46b0
packaging==23.2
protobuf==4.25.3
pydantic==2.7.4
pydantic_core==2.18.4
PyJWT==2.8.0
python-dotenv==1.0.1
python-engineio==4.9.1
python-multipart==0.0.9
python-socketio==5.11.3
PyYAML==6.0.1
requests==2.32.3
simple-websocket==1.0.0
sniffio==1.3.1
starlette==0.37.2
syncer==2.0.3
tokenizers==0.19.1
tomli==2.0.1
tqdm==4.66.4
typing-inspect==0.9.0
typing_extensions==4.12.2
uptrace==1.24.0
urllib3==2.2.2
uvicorn==0.25.0
watchfiles==0.20.0
wrapt==1.16.0
wsproto==1.2.0
zipp==3.19.2