
Add support for AWS Bedrock integration #5226

Open
Hama18 opened this issue Jan 28, 2025 · 10 comments · May be fixed by #6170

Hama18 commented Jan 28, 2025

What feature would you like to be added?

  • New Module: Create a new module for AWS Bedrock integration.
  • New Class: Implement a new class autogen_ext.models.aws.BedrockChatCompletionClient based on the ChatCompletionClient base class (a hypothetical usage sketch follows this list).
  • Reference: Please refer to existing extensions such as AzureAIChatCompletionClient for guidance.
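
A hypothetical usage sketch of the proposed client. The class does not exist yet, and the constructor parameters are my assumptions, modeled on AzureAIChatCompletionClient and boto3 conventions:

```python
import asyncio

from autogen_core.models import UserMessage
from autogen_ext.models.aws import BedrockChatCompletionClient  # proposed module, does not exist yet


async def main() -> None:
    # Hypothetical constructor; "region_name" mirrors boto3 conventions.
    client = BedrockChatCompletionClient(
        model="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example Bedrock model ID
        region_name="us-east-1",
    )
    result = await client.create([UserMessage(content="Hello, world!", source="user")])
    print(result.content)


asyncio.run(main())
```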

Why is this needed?

Benefits:

  • Official support for AWS Bedrock would ensure stability, ease of use, and proper maintenance.
  • It would enhance the functionality of AutogenStudio for users who rely on AWS services.

Additional Information:
I am willing to contribute to the development of this feature. Please provide guidance on how I can help make this feature officially supported.
In my environment, the model_info setting is not passed through from the JSON file, so I verified the behavior by passing the arguments directly in the source code.

Thank you for your time and consideration.

ekzhu (Collaborator) commented Jan 28, 2025

Thank you @Hama18. To start, take a look at CONTRIBUTING.md for how to run tests locally, and check in with us if you get stuck.

For API documentation, it would be good to have at least one runnable code block showing a hello-world example, and ideally a streaming example as well.

The two examples in the official AWS doc are great:

https://docs.aws.amazon.com/code-library/latest/ug/python_3_bedrock-runtime_code_examples.html
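
For example, a minimal hello-world plus streaming sketch along the lines of those AWS examples (model ID and region are placeholders; use ones enabled in your account):

```python
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")
model_id = "anthropic.claude-3-5-sonnet-20240620-v1:0"

# Hello-world call via the Converse API.
response = client.converse(
    modelId=model_id,
    messages=[{"role": "user", "content": [{"text": "Hello, world!"}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.0},
)
print(response["output"]["message"]["content"][0]["text"])

# Streaming variant via ConverseStream.
stream = client.converse_stream(
    modelId=model_id,
    messages=[{"role": "user", "content": [{"text": "Hello again!"}]}],
)
for event in stream["stream"]:
    if "contentBlockDelta" in event:
        print(event["contentBlockDelta"]["delta"]["text"], end="")
```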

For the dependency, add an aws extra for the boto3 dependency, with a minimum version bound set to the latest stable release.

For unit tests, create them following those for the Azure AI client. Also create one that hits the actual endpoint, using pytest.skip to skip it when access is denied; see the existing tests against the OpenAI and Gemini endpoints.
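
Something like the following skip pattern (the client import is hypothetical until the module exists, and the credential check is just one way to gate the test):

```python
import os

import pytest


@pytest.mark.asyncio  # requires pytest-asyncio, as used elsewhere in the test suite
async def test_bedrock_live_endpoint() -> None:
    # Skip when AWS credentials are unavailable rather than failing CI.
    if "AWS_ACCESS_KEY_ID" not in os.environ:
        pytest.skip("AWS credentials not set; skipping live Bedrock test.")

    from autogen_core.models import UserMessage
    from autogen_ext.models.aws import BedrockChatCompletionClient  # proposed client

    client = BedrockChatCompletionClient(model="anthropic.claude-3-5-sonnet-20240620-v1:0")
    result = await client.create([UserMessage(content="ping", source="user")])
    assert isinstance(result.content, str)
```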

ekzhu (Collaborator) commented Jan 29, 2025

It is also possible to use Semantic Kernel client adapter for AWS Bedrock models. See: https://microsoft.github.io/autogen/stable/reference/python/autogen_ext.models.semantic_kernel.html
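
A rough sketch of that route; the Bedrock connector import path and settings class are assumptions about semantic-kernel's API and should be checked against the linked docs:

```python
import asyncio

from autogen_core.models import UserMessage
from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter
from semantic_kernel.connectors.ai.bedrock import (  # assumed import path
    BedrockChatCompletion,
    BedrockChatPromptExecutionSettings,
)


async def main() -> None:
    # Wrap Semantic Kernel's Bedrock connector in the AutoGen adapter.
    sk_client = BedrockChatCompletion(model_id="anthropic.claude-3-5-sonnet-20240620-v1:0")
    model_client = SKChatCompletionAdapter(
        sk_client,
        prompt_settings=BedrockChatPromptExecutionSettings(temperature=0.0),
    )
    result = await model_client.create([UserMessage(content="Hello, world!", source="user")])
    print(result.content)


asyncio.run(main())
```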

ekzhu added a commit that referenced this issue Jan 30, 2025
Hama18 (Author) commented Jan 31, 2025

I apologize for any inconvenience, but I found that my requirements can be met by calling AWS Bedrock through an OpenAI-compatible interface.

From the discussion below, I learned that it can be called via OpenAIChatCompletionClient, and I have completed what I needed to do.

If no further Bedrock extension is necessary, please feel free to close this issue.

mynewstart commented

(Quoting Hama18's comment above about closing the issue if no further Bedrock extension is necessary.)

Can you share your code? I still can't use Bedrock models in AutoGen 0.4.

Hama18 (Author) commented Mar 11, 2025

I apologize for the lack of explanation earlier. I am calling AWS Bedrock via a Databricks serving endpoint. Since the serving endpoint is OpenAI-compatible, I was able to invoke it using OpenAIChatCompletionClient, assigning the Databricks access key to api_key and the Databricks serving endpoint to base_url.

My environment is autogen 0.4.1.
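
A minimal sketch of this setup; the endpoint name, workspace URL, token placeholder, and model_info values are illustrative, not my real configuration:

```python
from autogen_ext.models.openai import OpenAIChatCompletionClient

# Placeholders: substitute your own endpoint name, workspace URL, and token.
model_client = OpenAIChatCompletionClient(
    model="my-bedrock-claude-endpoint",  # Databricks serving endpoint name
    api_key="<DATABRICKS_ACCESS_TOKEN>",
    base_url="https://<workspace>.cloud.databricks.com/serving-endpoints",
    model_info={
        # Capabilities must be declared by hand for non-OpenAI models;
        # this relates to the model_info point in the issue description.
        "vision": False,
        "function_calling": True,
        "json_output": False,
        "family": "unknown",
    },
)
```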

mynewstart commented

Thanks, I can also call Bedrock using either the OpenAI-compatible route or the Semantic Kernel client.

Alex-Wenner-FHR commented Mar 25, 2025

(Quoting ekzhu's Semantic Kernel adapter suggestion above.)

I am having some issues using it through the agent approach. I can call the .create() method directly and it works well, but I never get a response out of the AssistantAgent.

I am using AssistantAgent with Swarm. For transparency, I am mixing agents across Azure OpenAI and Bedrock; I assumed this wouldn't be an issue, but maybe it is. Any thoughts would be appreciated!

I also tried Bedrock alone and it still did not work.

@ekzhu

ekzhu (Collaborator) commented Mar 25, 2025

Thanks for the tips. Would you like to help with creating a Bedrock client? @Alex-Wenner-FHR

Alex-Wenner-FHR commented Mar 26, 2025

(Quoting ekzhu's question above about helping create a Bedrock client.)

I would be happy to help. Are those efforts already underway? I thought I had seen some related pull requests.

I am also wondering whether I am misunderstanding something. I am using the Semantic Kernel wrapper around Bedrock and get no errors at all; I just never receive responses from the assistant agents in Swarm when using Bedrock. Let me know if you have any thoughts!

ekzhu (Collaborator) commented Mar 27, 2025

Without access to your code I can't know for sure what happened. Could you open a separate GitHub discussion and post your code there?
