I'm running the server normally with:

python -m routellm.openai_server --routers mf --weak-model ollama_chat/codeqwen

and I get this error whenever I send a prompt:
File "C:\Users\nate\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\litellm\utils.py", line 3586, in get_optional_params
_check_valid_arg(supported_params=supported_params)
File "C:\Users\nate\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\litellm\utils.py", line 3060, in _check_valid_arg
raise UnsupportedParamsError(
litellm.exceptions.UnsupportedParamsError: litellm.UnsupportedParamsError: ollama_chat does not support parameters: {'presence_penalty': 0.0}, for model=codeqwen. To drop these, set litellm.drop_params=True or for proxy:
litellm_settings: drop_params: true
I've tried modifying the yaml to no avail.
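For reference, this is roughly what I've been putting in it. I'm assuming the proxy-style litellm_settings block quoted in the error is what belongs in the config, and that routellm.openai_server actually reads litellm proxy settings, which may not be the case:

```yaml
# litellm proxy-style setting quoted in the error message;
# whether the RouteLLM server picks this up is exactly what I'm unsure about
litellm_settings:
  drop_params: true
```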
Please help!
Oh, and I'm trying to use it with continue.dev. This is my continue.dev config entry for it:
{
"model": "router-mf-0.11593",
"title": "routellm",
"completionOptions": {},
"apiBase": "http://192.168.0.215:6060/v1",
"apiKey": "no_api_key",
"provider": "openai"
},
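In the meantime, the only workaround I can think of is a small wrapper script that flips the flag named in the error before launching the server in the same process. This is just an untested sketch: it assumes the server honors the module-level litellm.drop_params flag and that routellm.openai_server can be launched via runpy the same way python -m does.

```python
# Untested workaround sketch: set the flag from the UnsupportedParamsError
# before starting the RouteLLM OpenAI-compatible server in-process.
import runpy
import sys

import litellm

# Drop request params (e.g. presence_penalty) that the target provider
# (ollama_chat) does not support, as suggested by the error message.
litellm.drop_params = True

# Mimic: python -m routellm.openai_server --routers mf --weak-model ollama_chat/codeqwen
sys.argv = [
    "routellm.openai_server",
    "--routers", "mf",
    "--weak-model", "ollama_chat/codeqwen",
]
runpy.run_module("routellm.openai_server", run_name="__main__")
```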