Llama 3.1-70b not supported anymore on watsonx #20

@wernergeyer

Description

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/wernergeyer/Projects/eval-assist-standalone/.venv/lib/python3.12/site-packages/evalassist/utils.py", line 387, in wrapper
    handle_exception(e)
  File "/Users/wernergeyer/Projects/eval-assist-standalone/.venv/lib/python3.12/site-packages/evalassist/utils.py", line 372, in handle_exception
    handle_exception(e.cause)
  File "/Users/wernergeyer/Projects/eval-assist-standalone/.venv/lib/python3.12/site-packages/evalassist/utils.py", line 352, in handle_exception
    raise HTTPException(status_code=400, detail=error_message)
fastapi.exceptions.HTTPException: 400: Model 'meta-llama/llama-3-1-70b-instruct' is not supported
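For context, the traceback shows an exception wrapper that walks the chain of wrapped causes and then surfaces a 400 to the client. A minimal, self-contained sketch of that pattern (the supported-model set, `UnsupportedModelError`, and the stand-in `HTTPException` class are hypothetical illustrations, not the actual evalassist internals):

```python
# Stdlib-only stand-in for fastapi.exceptions.HTTPException, so the
# sketch runs without FastAPI installed.
class HTTPException(Exception):
    def __init__(self, status_code: int, detail: str):
        self.status_code = status_code
        self.detail = detail
        super().__init__(f"{status_code}: {detail}")


# Hypothetical allow-list; the real set comes from the watsonx backend.
SUPPORTED_MODELS = {"meta-llama/llama-3-3-70b-instruct"}


class UnsupportedModelError(Exception):
    def __init__(self, model_id: str):
        super().__init__(f"Model '{model_id}' is not supported")


def handle_exception(e: BaseException) -> None:
    # Unwrap chained causes first, mirroring the recursive
    # handle_exception calls in the traceback above, then convert
    # whatever remains into a client-facing 400.
    if e.__cause__ is not None:
        handle_exception(e.__cause__)
    raise HTTPException(status_code=400, detail=str(e))


def evaluate(model_id: str) -> None:
    # Sketch of the wrapper frame: any internal failure is routed
    # through handle_exception, producing the
    # "During handling of the above exception" pattern seen above.
    try:
        if model_id not in SUPPORTED_MODELS:
            raise UnsupportedModelError(model_id)
    except Exception as e:
        handle_exception(e)
```

With this sketch, `evaluate("meta-llama/llama-3-1-70b-instruct")` raises `HTTPException` with status 400 and the same "is not supported" detail the issue reports.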

Metadata

    Labels

    bug (Something isn't working)
