
[Bug]: Endpoint Error when message.content not a string #5985

Kristovich opened this issue Feb 22, 2025 · 0 comments
Labels: 🐛 bug Something isn't working

What happened?

Minimal Reproduction Using OpenRouter:

  1. Create an agent with the Google tool (I used google/gemini-2.0-flash-001 as the model)
  2. Ask it to search something
  3. Switch to another LLM like meta-llama/llama-3.3-70b-instruct
  4. Send any message

This will fail with
error: [handleAbortError] AI response error; aborting request: Provider returned error

A message history containing a tool call also fails with other OpenRouter LLMs; I suspect it depends on whether the underlying provider supports this message format.

I was also able to get it to fail natively using the Perplexity provider in LibreChat. Google, OpenAI, and Anthropic work properly.

I've narrowed it down: some providers don't accept message.content as anything other than a string.
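
For example, these two user messages carry the same text, but the providers that fail only accept the first, plain-string form:

{ "role": "user", "content": "now search something else" }

{ "role": "user", "content": [ { "type": "text", "text": "now search something else" } ] }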

Try it yourself with curl, Postman, or something similar (a minimal Node script is also sketched after the two examples below):
POST to https://openrouter.ai/api/v1/chat/completions

{
  "model": "meta-llama/llama-3.3-70b-instruct",
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "google",
        "description": "A search engine optimized for comprehensive, accurate, and trusted results. Useful for when you need to answer questions about current events.",
        "parameters": {
          "type": "object",
          "properties": {
            "query": {
              "type": "string",
              "minLength": 1,
              "description": "The search query string."
            },
            "max_results": {
              "type": "number",
              "minimum": 1,
              "maximum": 10,
              "description": "The maximum number of search results to return. Defaults to 10."
            }
          },
          "required": [
            "query",
            "max_results"
          ],
          "additionalProperties": false,
          "$schema": "http://json-schema.org/draft-07/schema#"
        }
      }
    }
  ],
  "messages": [
  {
    "role": "system",
    "content": "Use Google tool if needed"
  },
  {
    "role": "user",
    "content": [
      {
        "type": "text",
        "text": "search something random for me real quick, I'm testing something"
      }
    ]
  },
  {
    "role": "assistant",
    "content": null,
    "tool_calls": [
      {
        "id": "tool_0_google",
        "type": "function",
        "function": {
          "name": "google",
          "arguments": "{\"query\":\"the history of rubber ducks\",\"max_results\":5}"
        }
      }
    ]
  },
  {
    "role": "tool",
    "content": "[OUTPUT_OMITTED_FOR_BREVITY]",
    "name": "google",
    "tool_call_id": "tool_0_google"
  },
  {
    "role": "assistant",
    "content": [
      {
        "type": "text",
        "text": "Okay, I searched for \"the history of rubber ducks\" and found some interesting results. It seems rubber ducks were invented in the late 19th century, and the first ones were solid and not intended to float.\n"
      }
    ]
  },
  {
    "role": "user",
    "content": [
      {
        "type": "text",
        "text": "now search something else"
      }
    ]
  }
]
}

Yields

{"error":{"message":"Provider returned error","code":422,"metadata":{"raw":"{\"detail\":[{\"loc\":[\"body\",\"messages\",4,\"role\"],\"msg\":\"unexpected value; permitted: <ChatMessageRole.TOOL: 'tool'>\",\"type\":\"value_error.const\",\"ctx\":{\"given\":\"assistant\",\"permitted\":[\"tool\"]}},{\"loc\":[\"body\",\"messages\",4,\"content\"],\"msg\":\"str type expected\",\"type\":\"type_error.str\"},{\"loc\":[\"body\",\"messages\",4,\"tool_call_id\"],\"msg\":\"field required\",\"type\":\"value_error.missing\"},{\"loc\":[\"body\",\"messages\",4,\"content\"],\"msg\":\"str type expected\",\"type\":\"type_error.str\"},{\"loc\":[\"body\",\"messages\",4,\"role\"],\"msg\":\"unexpected value; permitted: <ChatMessageRole.USER: 'user'>\",\"type\":\"value_error.const\",\"ctx\":{\"given\":\"assistant\",\"permitted\":[\"user\"]}},{\"loc\":[\"body\",\"messages\",4,\"role\"],\"msg\":\"unexpected value; permitted: <ChatMessageRole.SYSTEM: 'system'>\",\"type\":\"value_error.const\",\"ctx\":{\"given\":\"assistant\",\"permitted\":[\"system\"]}},{\"loc\":[\"body\",\"messages\",4,\"content\"],\"msg\":\"str type expected\",\"type\":\"type_error.str\"}]}","provider_name":"DeepInfra","isDownstreamPipeClean":true,"isErrorUpstreamFault":true}},"user_id":"*********"}

Whereas


{
  "model": "meta-llama/llama-3.3-70b-instruct",
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "google",
        "description": "A search engine optimized for comprehensive, accurate, and trusted results. Useful for when you need to answer questions about current events.",
        "parameters": {
          "type": "object",
          "properties": {
            "query": {
              "type": "string",
              "minLength": 1,
              "description": "The search query string."
            },
            "max_results": {
              "type": "number",
              "minimum": 1,
              "maximum": 10,
              "description": "The maximum number of search results to return. Defaults to 10."
            }
          },
          "required": [
            "query",
            "max_results"
          ],
          "additionalProperties": false,
          "$schema": "http://json-schema.org/draft-07/schema#"
        }
      }
    }
  ],
  "messages": [
  {
    "role": "system",
    "content": "Use Google tool if needed"
  },
  {
    "role": "user",
    "content": "search something random for me real quick, I'm testing something"
  },
  {
    "role": "assistant",
    "content": null,
    "tool_calls": [
      {
        "id": "tool_0_google",
        "type": "function",
        "function": {
          "name": "google",
          "arguments": "{\"query\":\"the history of rubber ducks\",\"max_results\":5}"
        }
      }
    ]
  },
  {
    "role": "tool",
    "content": "[OUTPUT_OMITTED_FOR_BREVITY]",
    "name": "google",
    "tool_call_id": "tool_0_google"
  },
  {
    "role": "assistant",
    "content": "Okay, I searched for \"the history of rubber ducks\" and found some interesting results. It seems rubber ducks were invented in the late 19th century, and the first ones were solid and not intended to float.\n"
  },
  {
    "role": "user",
    "content": "now search something else"
  }
]
}

Yields

{"id":"************","provider":"DeepInfra","model":"meta-llama/llama-3.3-70b-instruct","object":"chat.completion","created":1740263903,"choices":[{"logprobs":null,"finish_reason":"stop","native_finish_reason":"stop","index":0,"message":{"role":"assistant","content":"","refusal":null,"tool_calls":[{"index":0,"id":"***********","function":{"arguments":"{\"query\": \"what is the largest planet in our solar system\", \"max_results\": 5.0}","name":"google"},"type":"function"}]}}],"usage":{"prompt_tokens":443,"completion_tokens":27,"total_tokens":470}}

I was able to mostly fix it by patching node_modules/@librechat/agents/node_modules/@langchain/openai/dist/chat_models.cjs (it still fails on Perplexity):

async completionWithRetry(request, options) {
(...)

    // Unwrap single-element [{ type: 'text', text }] content into a plain string
    // so providers that only accept string content don't reject the request.
    const { messages } = request;
    const newMessages = messages.map((message) => {
        let textContent = message.content;
        if (
            Array.isArray(message.content) &&
            message.content.length === 1 &&
            message.content[0].type === 'text'
        ) {
            textContent = message.content[0].text;
        }
        return { ...message, content: textContent };
    });
    const res = await this.client.chat.completions.create(
        { ...request, messages: newMessages },
        requestOptions,
    );
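
A more general version of this workaround (just a sketch, not what I actually applied; normalizeMessageContent is a hypothetical helper) would join all text parts instead of only unwrapping single-element arrays:

// Hypothetical helper: collapse array-of-text-parts content into a plain string
// so providers that only accept string content don't reject the request.
function normalizeMessageContent(message) {
  const { content } = message;
  if (Array.isArray(content) && content.every((part) => part.type === 'text')) {
    return { ...message, content: content.map((part) => part.text).join('\n') };
  }
  return message;
}

// e.g. inside completionWithRetry:
// const newMessages = request.messages.map(normalizeMessageContent);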

Version Information

# git rev-parse HEAD
12605516902179808d2dda495764a386438dcd62

Steps to Reproduce

  1. Create an agent with the Google tool (I used google/gemini-2.0-flash-001 as the model)
  2. Ask it to search something
  3. Switch to another LLM like meta-llama/llama-3.3-70b-instruct
  4. Send any message

What browsers are you seeing the problem on?

Firefox

Relevant log output

Screenshots

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct