feat: support gpt-4o-audio #2032
base: main
Conversation
For stream requests, this is added to the request body.
- Refactor model name handling across multiple controllers to improve clarity and maintainability.
- Enhance error logging and handling for better debugging and request processing robustness.
- Update pricing models in accordance with new calculations, ensuring accuracy in the billing logic.
```diff
@@ -82,6 +83,27 @@ func (a *Adaptor) ConvertRequest(c *gin.Context, relayMode int, request *model.G
 		}
 		request.StreamOptions.IncludeUsage = true
 	}
+
+	// o1/o1-mini/o1-preview do not support system prompt and max_tokens
+	if strings.HasPrefix(request.Model, "o1") {
```
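The hunk above cuts off right after the new o1 check. Purely as an illustration (not code from this PR, and with simplified stand-in types rather than the real model.GeneralOpenAIRequest), the conversion such a check typically guards looks roughly like this:

```go
// Rough sketch of o1-specific request conversion; field and type names
// here are assumptions for illustration, not taken from this repository.
package main

import "strings"

type message struct {
	Role, Content string
}

type request struct {
	Model               string
	Messages            []message
	MaxTokens           int
	MaxCompletionTokens int
}

func convertForO1(req *request) {
	if !strings.HasPrefix(req.Model, "o1") {
		return
	}
	// o1/o1-mini/o1-preview reject a system prompt: downgrade it to a user message.
	for i := range req.Messages {
		if req.Messages[i].Role == "system" {
			req.Messages[i].Role = "user"
		}
	}
	// o1 models expect max_completion_tokens instead of max_tokens.
	if req.MaxTokens != 0 {
		req.MaxCompletionTokens = req.MaxTokens
		req.MaxTokens = 0
	}
}

func main() {
	r := &request{Model: "o1-mini", Messages: []message{{Role: "system", Content: "be brief"}}, MaxTokens: 100}
	convertForO1(r)
	// r.Messages[0].Role == "user", r.MaxCompletionTokens == 100, r.MaxTokens == 0
}
```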
Is this really necessary? If the user is calling o1, they should be expected to follow OpenAI's rules.
Mainly to reduce the burden on the frontend. In many cases the frontend is just a chat UI wrapper, and users switch between models freely; if there were many other restrictions besides the model name, the frontend implementation would become much more complicated.
Features
close #2022
Supports chat completions for the gpt-4o-audio series.
The billing model for streaming is not yet clear to me, so only stream==false is supported for now (see the request sketch below).
Before this is merged, you can try the feature at https://github.com/Laisky/one-api.
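A minimal non-streaming request sketch, assuming a standard OpenAI-compatible /v1/chat/completions endpoint; the base URL, port, API key, voice, and format values are placeholders, not settings from this PR:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Non-streaming chat completion against a gpt-4o-audio model.
	// "modalities" and "audio" follow the OpenAI chat completions
	// parameters for audio-capable models.
	body, _ := json.Marshal(map[string]any{
		"model":      "gpt-4o-audio-preview",
		"stream":     false,
		"modalities": []string{"text", "audio"},
		"audio":      map[string]string{"voice": "alloy", "format": "wav"},
		"messages": []map[string]string{
			{"role": "user", "content": "Say hello in one short sentence."},
		},
	})

	req, _ := http.NewRequest("POST", "http://localhost:3000/v1/chat/completions", bytes.NewReader(body))
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer sk-xxx") // placeholder key

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```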
Dependencies
This PR also includes #2022.
Self-test