ollama._types.ResponseError ResponseError(e.response.text, e.response.status_code) from None #396
Comments
Got the same error on Win10.
Getting the same error.
@Siki-cloud what's the text returned in the response error? @Rakshitha7989 - are you able to run from the CLI?
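For anyone hitting this, here is a minimal sketch of how to surface that text, assuming (per the issue title) that `ResponseError` is constructed from the HTTP response text and status code; the model name and prompt are placeholders:

```python
from ollama import chat
from ollama._types import ResponseError  # the exception named in this issue

try:
    response = chat(
        model='glm4:latest',  # placeholder model name taken from a later comment
        messages=[{'role': 'user', 'content': 'hello'}],
    )
    print(response.message.content)
except ResponseError as e:
    # The exception is raised as ResponseError(e.response.text, e.response.status_code),
    # so printing it (and its status code, if present) shows what the server returned.
    print('status code:', getattr(e, 'status_code', 'unknown'))
    print('error text:', e)
```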
I also encountered the same error on Win10. I changed the location where the Ollama models are stored - could that be the reason? My code:

```python
from ollama import chat, ChatResponse

response: ChatResponse = chat(model='glm4:latest', messages=[...])
# or access fields directly from the response object
print(response.message.content)
```

`ollama list` works fine from `C:\Users\Administrator>`, but running the script with `D:\anaconda\envs\stu\python.exe D:\aaproject\learn-langchain\3.py` raises the error.
But I can load the model using `from langchain_community.llms import Ollama`, and that method does not report an error. Why is this? My LangChain code begins:

```python
# Use the local Ollama to call the model and implement a simple chain-style Q&A
template = """
```
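For comparison, a minimal sketch of the LangChain path described above, under the assumption that `langchain_community`'s `Ollama` wrapper points at the same local server and the same `glm4:latest` model:

```python
# Sketch of the LangChain route the commenter says works; the model name is
# assumed to match the one used with ollama-python above.
from langchain_community.llms import Ollama

llm = Ollama(model='glm4:latest')          # talks to http://localhost:11434 by default
print(llm.invoke('Why is the sky blue?'))  # placeholder prompt
```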
Rolling out a change with some better error messaging. Until then, could all of you make sure that
Yes, I was able to use ollama in cmd, and I was able to use Ollama via LangChain's `from langchain_community.llms import Ollama`, but I get an error when I use the ollama-python library. I will provide more information, but I need guidance on what to do.
@Zyg187 could you try passing in the url when instantiating the ollama client?
@ParthSareen Sorry to keep you waiting. I'm a novice who is just getting started, and I don't quite understand what you mean. Could you provide a simple piece of code for me to test?
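A minimal sketch of what passing the URL explicitly might look like, assuming the server listens on the default local address `http://localhost:11434` and using the `glm4:latest` model mentioned above:

```python
from ollama import Client

# Instantiate the client with an explicit host instead of relying on defaults.
client = Client(host='http://localhost:11434')

response = client.chat(
    model='glm4:latest',
    messages=[{'role': 'user', 'content': 'hello'}],  # placeholder prompt
)
print(response.message.content)
```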
@ParthSareen I understand what you mean, and I have found the cause of the problem. Because I had a proxy enabled, the client couldn't reach the local server. The code runs successfully after turning off the proxy. I am curious why `from langchain_community.llms import Ollama` works normally even with the proxy on. How can I use the official ollama-python library normally while the proxy is enabled? Can you give me some suggestions?
Yeah, unsure why langchain works - I wonder if it's not hitting the proxied instance. Could you check this out and see if it helps? https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-use-ollama-behind-a-proxy
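One possible workaround (an assumption, not something confirmed in this thread): ollama-python talks to the server over httpx, which honors the standard `NO_PROXY` environment variable, so exempting localhost from the proxy before the client is created may let requests reach the local server even with a system proxy enabled:

```python
import os

# Exempt the local Ollama server from the system proxy. This must be set
# before the ollama client is created (i.e. before importing ollama here),
# since proxy settings are read from the environment at client construction.
os.environ['NO_PROXY'] = 'localhost,127.0.0.1'

from ollama import chat

response = chat(
    model='glm4:latest',
    messages=[{'role': 'user', 'content': 'hello'}],  # placeholder prompt
)
print(response.message.content)
```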
When I ran the following code, I got the error `ollama._types.ResponseError`. I don't know why this happened, can anyone help me? Thanks in advance.

Code:

Here are the downloaded LLMs:

Error: