Commit fe343d6

fix: bedrock llm error when evaluating rag qa validate_api_key (#350)
Add a conditional statement to check whether `validate_api_key` exists on the ChatModel object from LangChain (#348)
Parent commit: 6666c15

File tree: 1 file changed (+2, −1 lines)


src/ragas/metrics/base.py

Lines changed: 2 additions & 1 deletion
@@ -118,7 +118,8 @@ def init_model(self):
         to load all the models
         Also check if the api key is valid for OpenAI and AzureOpenAI
         """
-        self.llm.validate_api_key()
+        if hasattr(self.llm, "validate_api_key"):
+            self.llm.validate_api_key()
         if hasattr(self, "embeddings"):
             # since we are using Langchain Embeddings directly, we need to check this
             if hasattr(self.embeddings, "validate_api_key"):
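
For context, a minimal sketch of why the `hasattr` guard matters. This is not the ragas source; `FakeOpenAIChat` and `FakeBedrockChat` are hypothetical stand-ins for LangChain chat models. Bedrock-backed models authenticate through AWS credentials and do not expose `validate_api_key`, so an unconditional call raises `AttributeError`, while the guarded call simply skips validation.

```python
# Hypothetical stand-in classes, not real LangChain types.

class FakeOpenAIChat:
    def validate_api_key(self):
        # Models like OpenAI/AzureOpenAI expose an API-key check.
        print("API key validated")


class FakeBedrockChat:
    # Bedrock-style models authenticate via AWS credentials and
    # provide no validate_api_key method.
    pass


def init_model(llm):
    # Guarded call: only validate when the model actually supports it.
    if hasattr(llm, "validate_api_key"):
        llm.validate_api_key()


init_model(FakeOpenAIChat())   # prints "API key validated"
init_model(FakeBedrockChat())  # no AttributeError; validation is skipped
```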
