Why is this not working? #127
-
Title: Tool calling not working with TanStack AI + Ollama (LLaMA 3) — tool never triggers
Category: Q&A

Problem
I'm using TanStack AI + Ollama + Server Streaming with a custom tool (
Instead of calling the tool, the model doesn't send anything back, but if I remove the tool it responds normally.

My Setup
Server (
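A quick way to isolate whether TanStack AI or the model itself is at fault is to call Ollama's /api/chat endpoint directly with a tool attached and see whether message.tool_calls ever comes back. Below is a minimal sketch in TypeScript, assuming Ollama is running on its default local port and using a hypothetical get_weather tool rather than the actual tool from the setup above:

```ts
// Minimal repro against Ollama's /api/chat, bypassing TanStack AI entirely.
// Assumes Ollama is running locally on its default port; the get_weather
// tool below is a hypothetical placeholder, not the asker's actual tool.
const OLLAMA_URL = "http://localhost:11434/api/chat";

async function checkToolCalling(model: string): Promise<void> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      stream: false, // a single JSON response keeps the repro simple
      messages: [
        { role: "user", content: "What is the weather in Paris right now?" },
      ],
      tools: [
        {
          type: "function",
          function: {
            name: "get_weather",
            description: "Get the current weather for a city",
            parameters: {
              type: "object",
              properties: { city: { type: "string" } },
              required: ["city"],
            },
          },
        },
      ],
    }),
  });

  const data = await res.json();
  // A model without tool support answers in plain text and never
  // populates message.tool_calls.
  console.log(model, "->", data.message?.tool_calls ?? "no tool_calls");
}

await checkToolCalling("llama3");
```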
Replies: 2 comments
-
llama 3 doesn't support tools, maybe try using 3.1 and it should work
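If the repro sketch above is re-run with the newer tag (after pulling it with `ollama pull llama3.1`), the same request should start producing tool calls; the only change is the model string:

```ts
// Same request as above, only the model tag changes.
// Run `ollama pull llama3.1` first so the tag exists locally.
await checkToolCalling("llama3.1"); // expected: a get_weather tool call
```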
-
As @AlemTuzlak has mentioned, llama 3 doesn't support tools. I have an open PR that should add more thorough type safety (as well as a bunch more models) and will warn you if you try to pass options that the model doesn't support. This is something I'm still trying to figure out, but from what I can tell the Ollama API swallows configuration that the models don't consume, so it's taken me some time to work out 😅
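Until that lands, one stopgap is to ask Ollama itself what a model supports. This is a hedged sketch, not the PR's implementation: it assumes a recent Ollama build whose /api/show response includes a capabilities array (older builds omit the field, and some expect "name" instead of "model" in the request body), and warns before tool definitions get silently dropped:

```ts
// Sketch: query Ollama's /api/show for a model's reported capabilities
// and warn before wiring tools to a model that will ignore them.
// Assumes a recent Ollama build that returns a "capabilities" array;
// older builds omit it, so we warn in that case too.
async function warnIfNoToolSupport(model: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/show", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model }), // older Ollama versions may expect "name"
  });
  const data: { capabilities?: string[] } = await res.json();

  if (!data.capabilities) {
    console.warn(`${model}: this Ollama version does not report capabilities; check the model card manually.`);
  } else if (!data.capabilities.includes("tools")) {
    console.warn(`${model}: no "tools" capability reported; tool definitions will be silently dropped.`);
  }
}

await warnIfNoToolSupport("llama3");   // expected to warn
await warnIfNoToolSupport("llama3.1"); // silent if tools are supported
```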