
fix: bedrock llm error when evaluating rag qa validate_api_key #350

Merged

Conversation

arm-diaz (Contributor)

Adds a conditional statement to check whether `validate_api_key` exists on the LangChain `ChatModel` object (#348).
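The guard described above can be sketched as follows. This is a minimal, hypothetical illustration (the class names and `safe_validate` helper are stand-ins, not ragas or LangChain code): the idea is to call `validate_api_key` only when the chat model actually provides it, since some providers such as Bedrock do not implement that method.

```python
class BedrockChat:
    """Stand-in for a provider chat model that lacks validate_api_key."""


class OpenAIChat:
    """Stand-in for a provider chat model that exposes validate_api_key."""

    def validate_api_key(self):
        return True


def safe_validate(llm):
    # Guard the call: only validate when the model defines the method.
    if hasattr(llm, "validate_api_key") and callable(llm.validate_api_key):
        return llm.validate_api_key()
    # Nothing to validate for this provider; skip instead of raising.
    return None


print(safe_validate(OpenAIChat()))   # True
print(safe_validate(BedrockChat()))  # None
```

The `hasattr` check keeps the validation path optional per provider, so models without the method no longer raise an `AttributeError` during evaluation.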

@arm-diaz (Contributor, Author) commented Nov 29, 2023

This issue is resolved by the fix in this PR.

@jjmachan (Member) left a comment

thank you so much for the fix @arm-diaz 😄

@jjmachan jjmachan merged commit fe343d6 into explodinggradients:main Nov 30, 2023
@jjmachan (Member)

Hey @arm-diaz
Thank you so much for helping us improve ragas with your PR ❤️
Now, since it's Christmas and all, we wanted to send you a postcard and a couple of stickers as our way of saying thank you for your contribution. If you are interested, could you shoot an email to [email protected]? I'll tell you more.

Cheers 🙂
Jithin

@arm-diaz (Contributor, Author)

Hey,

Thank you! I will get in touch via that email. Sorry for disappearing these last couple of weeks; I was super busy.

I'll test and review the framework against Bedrock again this week.

Have a great Christmas and holiday season!
