When I ran the following code, I got the error ollama._types.ResponseError. I don't know why this happened; can anyone help me? Thanks in advance.
Code:
import ollama

def test():
    resp = ollama.chat(
        model='llama3.1:8b',
        messages=[{'role': 'user', 'content': 'output the number from 1 to 5'}])
    print(resp)

test()
Here are the downloaded LLMs:
/code/gencf# ollama list
NAME               ID              SIZE    MODIFIED
llama3.1:8b        46e0c10c039e    4.9 GB  53 minutes ago
llama3.1:latest    46e0c10c039e    4.9 GB  55 minutes ago
mistral:latest     f974a74358d6    4.1 GB  9 hours ago
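
In case it helps with diagnosing this, here is a minimal sketch to check that the Python client can reach the same server (the host below is Ollama's default and an assumption about my setup):

import ollama

# http://localhost:11434 is Ollama's default host; adjust if the server
# runs elsewhere. If this call also fails, the problem is the connection
# to the server rather than the chat request itself.
client = ollama.Client(host='http://localhost:11434')
print(client.list())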
Error:
Traceback (most recent call last):
  File "/code/gencf/multi_gpu_proc_generate.py", line 270, in <module>
    test()
  File "/code/gencf/multi_gpu_proc_generate.py", line 228, in test
    resp = ollama.chat(
  File "/root/anaconda3/envs/llmxq12/lib/python3.10/site-packages/ollama/_client.py", line 332, in chat
    return self._request(
  File "/root/anaconda3/envs/llmxq12/lib/python3.10/site-packages/ollama/_client.py", line 177, in _request
    return cls(**self._request_raw(*args, **kwargs).json())
  File "/root/anaconda3/envs/llmxq12/lib/python3.10/site-packages/ollama/_client.py", line 122, in _request_raw
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError
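
From line 122 of _client.py in the traceback, the ResponseError is built from the HTTP response body and status code, so catching it should reveal what the server actually returned. A sketch, assuming the error and status_code attributes that ollama._types.ResponseError exposes:

import ollama

try:
    resp = ollama.chat(
        model='llama3.1:8b',
        messages=[{'role': 'user', 'content': 'output the number from 1 to 5'}])
    print(resp)
except ollama.ResponseError as e:
    # The exception carries the server's error text and HTTP status,
    # which the bare exception name at the end of the traceback does not show.
    print('status code:', e.status_code)
    print('error text:', e.error)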