Commit
doc & sample: Update documentation for human-in-the-loop and UserProxyAgent; Add UserProxyAgent to ChainLit sample; (#5656)

Resolves #5610

It also addresses various questions about how to use the user proxy agent
and human-in-the-loop.
ekzhu authored Feb 25, 2025
1 parent a54a85e commit a14aeab
Showing 5 changed files with 196 additions and 14 deletions.
@@ -39,11 +39,6 @@ class UserProxyAgent(BaseChatAgent, Component[UserProxyAgentConfig]):
This agent can be used to represent a human user in a chat system by providing a custom input function.
Args:
name (str): The name of the agent.
description (str, optional): A description of the agent.
input_func (Optional[Callable[[str], str]], Callable[[str, Optional[CancellationToken]], Awaitable[str]]): A function that takes a prompt and returns a user input string.
.. note::
Using :class:`UserProxyAgent` puts a running team in a temporary blocked
@@ -58,7 +53,17 @@ class UserProxyAgent(BaseChatAgent, Component[UserProxyAgentConfig]):
You can run the team again with the user input. This way, the state of the team
can be saved and restored when the user responds.
See `Human-in-the-loop <https://microsoft.github.io/autogen/dev/user-guide/agentchat-user-guide/tutorial/human-in-the-loop.html>`_ for more information.
See `Human-in-the-loop <https://microsoft.github.io/autogen/stable/user-guide/agentchat-user-guide/tutorial/human-in-the-loop.html>`_ for more information.
Args:
name (str): The name of the agent.
description (str, optional): A description of the agent.
input_func (Optional[Callable[[str], str]], Callable[[str, Optional[CancellationToken]], Awaitable[str]]): A function that takes a prompt and returns a user input string.
For examples of integrating with web and UI frameworks, see the following:
* `FastAPI <https://github.com/microsoft/autogen/tree/main/python/samples/agentchat_fastapi>`_
* `ChainLit <https://github.com/microsoft/autogen/tree/main/python/samples/agentchat_chainlit>`_
Example:
Simple usage case::
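The usage example itself is collapsed in this diff view. For illustration, `input_func` can be any callable that takes a prompt (and optionally a cancellation token) and returns the user's reply; a minimal framework-free sketch (the function name is hypothetical):

```python
import asyncio


async def console_input_func(prompt: str, cancellation_token=None) -> str:
    # Run the blocking built-in input() in a worker thread so the
    # event loop driving the team is not blocked while waiting.
    return await asyncio.to_thread(input, prompt)
```

A function like this could then be passed as `input_func` when constructing the agent.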
@@ -14,7 +14,14 @@
"There are two main ways to interact with the team from your application:\n",
"\n",
"1. During a team's run -- execution of {py:meth}`~autogen_agentchat.teams.BaseGroupChat.run` or {py:meth}`~autogen_agentchat.teams.BaseGroupChat.run_stream`, provide feedback through a {py:class}`~autogen_agentchat.agents.UserProxyAgent`.\n",
"2. Once the run terminates, provide feedback through input to the next call to {py:meth}`~autogen_agentchat.teams.BaseGroupChat.run` or {py:meth}`~autogen_agentchat.teams.BaseGroupChat.run_stream`.\n"
"2. Once the run terminates, provide feedback through input to the next call to {py:meth}`~autogen_agentchat.teams.BaseGroupChat.run` or {py:meth}`~autogen_agentchat.teams.BaseGroupChat.run_stream`.\n",
"\n",
"We will cover both methods in this section.\n",
"\n",
"To jump straight to code samples on integration with web and UI frameworks, see the following links:\n",
"- [AgentChat + FastAPI](https://github.com/microsoft/autogen/tree/main/python/samples/agentchat_fastapi)\n",
"- [AgentChat + ChainLit](https://github.com/microsoft/autogen/tree/main/python/samples/agentchat_chainlit)\n",
"- [AgentChat + Streamlit](https://github.com/microsoft/autogen/tree/main/python/samples/agentchat_streamlit)"
]
},
{
@@ -31,6 +38,12 @@
"The team will decide when to call the {py:class}`~autogen_agentchat.agents.UserProxyAgent`\n",
"to ask for feedback from the user.\n",
"\n",
"For example in a {py:class}`~autogen_agentchat.teams.RoundRobinGroupChat` team, \n",
"the {py:class}`~autogen_agentchat.agents.UserProxyAgent` is called in the order\n",
"in which it is passed to the team, while in a {py:class}`~autogen_agentchat.teams.SelectorGroupChat`\n",
"team, the selector prompt or selector function determines when the \n",
"{py:class}`~autogen_agentchat.agents.UserProxyAgent` is called.\n",
"\n",
"The following diagram illustrates how you can use \n",
"{py:class}`~autogen_agentchat.agents.UserProxyAgent`\n",
"to get feedback from the user during a team's run:\n",
@@ -150,7 +163,10 @@
" # ...\n",
"```\n",
"\n",
"See the [AgentChat FastAPI sample](https://github.com/microsoft/autogen/blob/main/python/samples/agentchat_fastapi) for a complete example."
"See the [AgentChat FastAPI sample](https://github.com/microsoft/autogen/blob/main/python/samples/agentchat_fastapi) for a complete example.\n",
"\n",
"For [ChainLit](https://github.com/Chainlit/chainlit) integration with {py:class}`~autogen_agentchat.agents.UserProxyAgent`,\n",
"see the [AgentChat ChainLit sample](https://github.com/microsoft/autogen/blob/main/python/samples/agentchat_chainlit)."
]
},
{
@@ -424,7 +440,15 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"You can see the team continued after the user provided the information."
"You can see the team continued after the user provided the information.\n",
"\n",
"```{note}\n",
"If you are using {py:class}`~autogen_agentchat.teams.Swarm` team with\n",
"{py:class}`~autogen_agentchat.conditions.HandoffTermination` targeting user,\n",
"to resume the team, you need to set the `task` to a {py:class}`~autogen_agentchat.messages.HandoffMessage`\n",
"with the `target` set to the next agent you want to run.\n",
"See [Swarm](../swarm.ipynb) for more details.\n",
"```"
]
}
],
20 changes: 18 additions & 2 deletions python/samples/agentchat_chainlit/README.md
@@ -5,8 +5,6 @@ interacts with an [AgentChat](https://microsoft.github.io/autogen/stable/user-gu
agent or a team, using [Chainlit](https://github.com/Chainlit/chainlit),
and support streaming messages.

![AgentChat](docs/chainlit_autogen.png).

## Installation

To run this sample, you will need to install the following packages:
@@ -55,6 +53,24 @@ and the other one is instructed to be a critic and provide feedback.
The two agents will respond in round-robin fashion until
the 'APPROVE' is mentioned by the critic agent.

## Running the Team Sample with UserProxyAgent

The third sample demonstrates how to interact with a team of agents including
a [UserProxyAgent](https://microsoft.github.io/autogen/stable/reference/python/autogen_agentchat.agents.html#autogen_agentchat.agents.UserProxyAgent)
for approval or rejection.

```shell
chainlit run app_team_user_proxy.py -h
```

You can use one of the starters. For example, ask "Write code to reverse a string."

By default, the `UserProxyAgent` will request an input action from the user
to approve or reject the response from the team.
When the user approves the response, the `UserProxyAgent` will send a message
to the team containing the text "APPROVE", and the team will stop responding.
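The approve/reject logic described above reduces to a small mapping from the clicked action's payload to the reply text the team sees (a simplified sketch; the function name is hypothetical):

```python
from typing import Optional


def action_to_reply(payload: Optional[dict]) -> str:
    # Map the clicked action's payload to the text the team receives.
    # "APPROVE." matches the team's text-mention termination condition.
    if payload and payload.get("value") == "approve":
        return "APPROVE."
    if payload:
        return "REJECT."
    return "User did not provide any input."
```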


## Next Steps

There are a few ways you can extend this example:
140 changes: 140 additions & 0 deletions python/samples/agentchat_chainlit/app_team_user_proxy.py
@@ -0,0 +1,140 @@
from typing import List, cast

import chainlit as cl
import yaml
from autogen_agentchat.agents import AssistantAgent, UserProxyAgent
from autogen_agentchat.base import TaskResult
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.messages import ModelClientStreamingChunkEvent, TextMessage
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_core import CancellationToken
from autogen_core.models import ChatCompletionClient


async def user_input_func(prompt: str, cancellation_token: CancellationToken | None = None) -> str:
    """Get user input from the UI for the user proxy agent."""
    try:
        response = await cl.AskUserMessage(content=prompt).send()
    except TimeoutError:
        return "User did not provide any input within the time limit."
    if response:
        return response["output"]  # type: ignore
    else:
        return "User did not provide any input."


async def user_action_func(prompt: str, cancellation_token: CancellationToken | None = None) -> str:
    """Get user action from the UI for the user proxy agent."""
    try:
        response = await cl.AskActionMessage(
            content="Pick an action",
            actions=[
                cl.Action(name="approve", label="Approve", payload={"value": "approve"}),
                cl.Action(name="reject", label="Reject", payload={"value": "reject"}),
            ],
        ).send()
    except TimeoutError:
        return "User did not provide any input within the time limit."
    if response and response.get("payload"):  # type: ignore
        if response.get("payload").get("value") == "approve":  # type: ignore
            return "APPROVE."  # This is the termination condition.
        else:
            return "REJECT."
    else:
        return "User did not provide any input."


@cl.on_chat_start  # type: ignore
async def start_chat() -> None:
    # Load model configuration and create the model client.
    with open("model_config.yaml", "r") as f:
        model_config = yaml.safe_load(f)
    model_client = ChatCompletionClient.load_component(model_config)

    # Create the assistant agent.
    assistant = AssistantAgent(
        name="assistant",
        model_client=model_client,
        system_message="You are a helpful assistant.",
        model_client_stream=True,  # Enable model client streaming.
    )

    # Create the critic agent.
    critic = AssistantAgent(
        name="critic",
        model_client=model_client,
        system_message="You are a critic. Provide constructive feedback. "
        "Respond with 'APPROVE' if your feedback has been addressed.",
        model_client_stream=True,  # Enable model client streaming.
    )

    # Create the user proxy agent.
    user = UserProxyAgent(
        name="user",
        # input_func=user_input_func,  # Uncomment to request free-form text input instead.
        input_func=user_action_func,  # Request an approve/reject action from the user.
    )

    # Termination condition.
    termination = TextMentionTermination("APPROVE", sources=["user"])

    # Chain the assistant, critic, and user agents using RoundRobinGroupChat.
    group_chat = RoundRobinGroupChat([assistant, critic, user], termination_condition=termination)

    # Store the team in the user session.
    cl.user_session.set("prompt_history", "")  # type: ignore
    cl.user_session.set("team", group_chat)  # type: ignore


@cl.set_starters  # type: ignore
async def set_starters() -> List[cl.Starter]:
    return [
        cl.Starter(
            label="Poem Writing",
            message="Write a poem about the ocean.",
        ),
        cl.Starter(
            label="Story Writing",
            message="Write a story about a detective solving a mystery.",
        ),
        cl.Starter(
            label="Write Code",
            message="Write a function that merges two lists of numbers into a single sorted list.",
        ),
    ]


@cl.on_message  # type: ignore
async def chat(message: cl.Message) -> None:
    # Get the team from the user session.
    team = cast(RoundRobinGroupChat, cl.user_session.get("team"))  # type: ignore
    # Streaming response message.
    streaming_response: cl.Message | None = None
    # Stream the messages from the team.
    async for msg in team.run_stream(
        task=[TextMessage(content=message.content, source="user")],
        cancellation_token=CancellationToken(),
    ):
        if isinstance(msg, ModelClientStreamingChunkEvent):
            # Stream the model client response to the user.
            if streaming_response is None:
                # Start a new streaming response.
                streaming_response = cl.Message(content="", author=msg.source)
            await streaming_response.stream_token(msg.content)
        elif streaming_response is not None:
            # Done streaming the model client response.
            # We can skip the current message as it is just the complete message
            # of the streaming response.
            await streaming_response.send()
            # Reset the streaming response so we won't enter this block again
            # until the next streaming response is complete.
            streaming_response = None
        elif isinstance(msg, TaskResult):
            # Send the task termination message.
            final_message = "Task terminated. "
            if msg.stop_reason:
                final_message += msg.stop_reason
            await cl.Message(content=final_message).send()
        else:
            # Skip all other message types.
            pass
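The chunk-handling pattern in `chat` above (accumulate streaming chunks, then drop the duplicate complete message) can be sketched framework-free as follows; the event classes here are hypothetical stand-ins for the AgentChat message types:

```python
from dataclasses import dataclass
from typing import Iterable, List, Optional, Union


@dataclass
class Chunk:
    """A streaming fragment of an in-progress message (hypothetical stand-in)."""
    content: str


@dataclass
class Complete:
    """A finished message; it duplicates prior chunks if any were streamed."""
    content: str


def aggregate_stream(events: Iterable[Union[Chunk, Complete]]) -> List[str]:
    messages: List[str] = []
    buffer: Optional[str] = None
    for ev in events:
        if isinstance(ev, Chunk):
            # Accumulate streamed fragments into one message.
            buffer = (buffer or "") + ev.content
        elif buffer is not None:
            # The complete message duplicates the streamed chunks:
            # emit the accumulated text once and discard the duplicate.
            messages.append(buffer)
            buffer = None
        else:
            # A message that was never streamed: emit it directly.
            messages.append(ev.content)
    return messages
```

This is the same bookkeeping the sample performs with `streaming_response`, minus the ChainLit UI calls.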
3 changes: 0 additions & 3 deletions python/samples/agentchat_chainlit/docs/chainlit_autogen.png

This file was deleted.
