
AttributeError: 'NoneType' object has no attribute '_PENDING' #1840

Open
yidasanqian opened this issue Jan 13, 2025 · 0 comments
Labels
bug: Something isn't working
module-metrics: this is part of metrics module
question: Further information is requested

Comments

@yidasanqian

[ ] I have checked the documentation and related resources and couldn't resolve my bug.

Describe the bug
AttributeError: 'NoneType' object has no attribute '_PENDING'

Ragas version: 0.2.10
Python version: 3.12.8

Code to Reproduce

import os
from ragas import SingleTurnSample
from ragas.metrics import AspectCritic
from langchain_openai import AzureChatOpenAI
from langchain_openai import AzureOpenAIEmbeddings
from ragas.llms import LangchainLLMWrapper
from ragas.embeddings import LangchainEmbeddingsWrapper

azure_config = {
    "base_url": os.getenv("AZURE_OPENAI_API_BASE"),  # your endpoint
    "api_key": os.getenv("AZURE_OPENAI_API_KEY"),  # your API key
    "model_name": os.getenv("AZURE_OPENAI_MODEL_NAME"),  # your model deployment name
    "api_version": os.getenv("AZURE_OPENAI_API_VERSION"),  # your API version
    "embedding_deployment": "text-embedding-ada-002",  # your embedding deployment name
    "embedding_name": "text-embedding-ada-002",  # your embedding name
}


evaluator_llm = LangchainLLMWrapper(AzureChatOpenAI(
    openai_api_version=azure_config["api_version"],
    azure_endpoint=azure_config["base_url"],
    azure_deployment=azure_config["model_name"],
    model=azure_config["model_name"],
    validate_base_url=False,
))

# init the embeddings for answer_relevancy, answer_correctness and answer_similarity
evaluator_embeddings = LangchainEmbeddingsWrapper(AzureOpenAIEmbeddings(
    openai_api_version=azure_config["api_version"],
    azure_endpoint=azure_config["base_url"],
    azure_deployment=azure_config["embedding_deployment"],
    model=azure_config["embedding_name"],
))

test_data = {
    "user_input": "summarise given text\nThe company reported an 8% rise in Q3 2024, driven by strong performance in the Asian market. Sales in this region have significantly contributed to the overall growth. Analysts attribute this success to strategic marketing and product localization. The positive trend in the Asian market is expected to continue into the next quarter.",
    "response": "The company experienced an 8% increase in Q3 2024, largely due to effective marketing strategies and product adaptation, with expectations of continued growth in the coming quarter.",
}

metric = AspectCritic(name="summary_accuracy", llm=evaluator_llm, definition="Verify if the summary is accurate.")
test_data = SingleTurnSample(**test_data)

evaluation_result = metric.single_turn_score(test_data)
print(evaluation_result)

Error trace

d:\Develop\conda\envs\llm-evaluator\Lib\site-packages\pysbd\segmenter.py:66: SyntaxWarning: invalid escape sequence '\s'
  for match in re.finditer('{0}\s*'.format(re.escape(sent)), self.original_text):
d:\Develop\conda\envs\llm-evaluator\Lib\site-packages\pysbd\lang\arabic.py:29: SyntaxWarning: invalid escape sequence '\.'
  txt = re.sub('(?<={0})\.'.format(am), '∯', txt)
d:\Develop\conda\envs\llm-evaluator\Lib\site-packages\pysbd\lang\persian.py:29: SyntaxWarning: invalid escape sequence '\.'
  txt = re.sub('(?<={0})\.'.format(am), '∯', txt)
1
Exception ignored in: <function Task.__del__ at 0x000001A5AA03E5C0>
Traceback (most recent call last):
  File "d:\Develop\conda\envs\llm-evaluator\Lib\asyncio\tasks.py", line 143, in __del__
AttributeError: 'NoneType' object has no attribute '_PENDING'
Exception ignored in: <function Task.__del__ at 0x000001A5AA03E5C0>
Traceback (most recent call last):
  File "d:\Develop\conda\envs\llm-evaluator\Lib\asyncio\tasks.py", line 143, in __del__
AttributeError: 'NoneType' object has no attribute '_PENDING'
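
Note that the score (1) prints before the exceptions, so the evaluation itself appears to complete; the "Exception ignored in: <function Task.__del__ ...>" lines only show up at interpreter shutdown. (The pysbd SyntaxWarnings at the top look unrelated: Python 3.12 warns about invalid escape sequences in string literals.) My understanding, which is an assumption on my part from reading asyncio/tasks.py, is that a still-pending asyncio Task is garbage-collected after asyncio's module globals have already been cleared, so the futures._PENDING lookup in Task.__del__ resolves against None. A minimal standalone sketch that can produce the same shutdown noise without ragas (it depends on garbage-collection order at exit, so it may not fire on every run):

import asyncio

async def work():
    await asyncio.sleep(1)

# Schedule a task on a loop that is never run or closed. At interpreter
# shutdown the still-pending Task is finalized after asyncio's module
# globals are torn down, which can surface as
# "Exception ignored in: <function Task.__del__ ...>".
loop = asyncio.new_event_loop()
task = loop.create_task(work())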

How can I fix this?
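
Would driving the metric through an explicit event loop be the intended usage? A sketch of what I mean, assuming single_turn_ascore is the async counterpart of single_turn_score (my reading of the docs; I have not confirmed that this avoids the shutdown error):

import asyncio

async def main():
    # test_data is the SingleTurnSample built above; single_turn_ascore is
    # assumed here to be the async counterpart of single_turn_score.
    return await metric.single_turn_ascore(test_data)

# asyncio.run() waits for the coroutine and closes the loop before the
# interpreter exits, so no pending Task should be left for shutdown-time GC.
print(asyncio.run(main()))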

yidasanqian added the bug label on Jan 13, 2025
dosubot (bot) added the module-metrics and question labels on Jan 13, 2025