litellm.drop_params error when running the openai server #63

Open

masaruduy opened this issue Oct 7, 2024 · 1 comment

@masaruduy
I'm running the server normally with: python -m routellm.openai_server --routers mf --weak-model ollama_chat/codeqwen
and I get this error whenever I attempt a prompt:

File "C:\Users\nate\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\litellm\utils.py", line 3586, in get_optional_params
_check_valid_arg(supported_params=supported_params)
File "C:\Users\nate\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\litellm\utils.py", line 3060, in _check_valid_arg
raise UnsupportedParamsError(
litellm.exceptions.UnsupportedParamsError: litellm.UnsupportedParamsError: ollama_chat does not support parameters: {'presence_penalty': 0.0}, for model=codeqwen. To drop these, set litellm.drop_params=True or for proxy:

litellm_settings:
  drop_params: true

I've tried modifying the yaml to no avail.
Please help!
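
One possible workaround, sketched below under the assumption that routellm's server calls litellm in-process: start the server from a small wrapper script that sets litellm's global drop_params flag before the module is launched. The wrapper file name and the use of runpy are illustrative, not part of routellm.

# run_server.py - hypothetical wrapper; sets litellm.drop_params so that
# parameters unsupported by ollama_chat (e.g. presence_penalty) are dropped
# instead of raising UnsupportedParamsError.
import sys
import runpy

import litellm

litellm.drop_params = True  # global litellm setting: silently drop unsupported params

# Forward the same CLI arguments used above to routellm.openai_server.
sys.argv = [
    "routellm.openai_server",
    "--routers", "mf",
    "--weak-model", "ollama_chat/codeqwen",
]
runpy.run_module("routellm.openai_server", run_name="__main__")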

@masaruduy
Author

Oh, and I'm trying to use it with continue.dev:
{
  "model": "router-mf-0.11593",
  "title": "routellm",
  "completionOptions": {},
  "apiBase": "http://192.168.0.215:6060/v1",
  "apiKey": "no_api_key",
  "provider": "openai"
},
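
To check whether the error comes from the server itself rather than from continue.dev's request options, a minimal request can be sent directly to the apiBase above; this sketch assumes the server exposes the standard OpenAI-compatible /v1/chat/completions route (the one continue.dev calls) and that the requests package is installed.

# check_server.py - send a bare chat completion request with no extra
# sampling options, so any UnsupportedParamsError must originate server-side.
import json
import requests

resp = requests.post(
    "http://192.168.0.215:6060/v1/chat/completions",
    headers={"Authorization": "Bearer no_api_key"},
    json={
        "model": "router-mf-0.11593",
        "messages": [{"role": "user", "content": "hello"}],
    },
    timeout=60,
)
print(resp.status_code)
print(json.dumps(resp.json(), indent=2))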
