Closed as not planned
Description
Hi everyone!

I'm trying to create an agent that extracts load details and detects warnings, and then hands off to another agent. Once the details are extracted, I need to send a request to another service to visualize them. I want to do this with on_handoff: is there any way to get the LLM output inside that callback? Here's what I have so far:
import os

from agents import Agent, OpenAIChatCompletionsModel, RunContextWrapper, handoff
from agents.extensions.handoff_prompt import prompt_with_handoff_instructions

# azure_client, OrchestratorContext and intent_detector_agent are defined elsewhere.

def update_details(ctx: RunContextWrapper[OrchestratorContext]) -> str:
    # send request to update details with A structure
    # send request to upsert warnings with B structure
    return details  # <- details should come from the LLM extraction; how do I get it here?

# Define the agents
extractor_agent = Agent[OrchestratorContext](
    name="Triage Agent",
    instructions=prompt_with_handoff_instructions(
        "You are responsible for extracting the load details ...."
    ),
    model=OpenAIChatCompletionsModel(
        model=os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME"), openai_client=azure_client
    ),
    handoffs=[handoff(agent=intent_detector_agent, on_handoff=update_details)],
)
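
For context, this is roughly what I'm hoping is possible. It's only a sketch, assuming handoff() accepts an input_type and passes the LLM-generated data into on_handoff; LoadDetails and its fields are hypothetical placeholders for my real structures:

from pydantic import BaseModel
from agents import RunContextWrapper, handoff

# Hypothetical model I'd want the LLM to fill in when handing off.
class LoadDetails(BaseModel):
    details: str
    warnings: list[str]

async def update_details(
    ctx: RunContextWrapper[OrchestratorContext], input_data: LoadDetails
) -> None:
    # If input_data held the LLM output, I could forward it from here
    # to the visualization service (A structure / B structure).
    print(input_data.details, input_data.warnings)

handoff_obj = handoff(
    agent=intent_detector_agent,
    on_handoff=update_details,
    input_type=LoadDetails,  # assumption: the LLM populates this during the handoff
)

Is something like this supported, or is there another recommended way to access the extractor agent's output at handoff time?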