Commit ec8e251

change gpt-4-turbo maxResponse configuration template (#814)
With the latest 4.6.8 configuration file, selecting gpt-4-turbo in a chat returns the error: null max_tokens is too large: 62500. This model supports at most 4096 completion tokens, whereas you provided 62500. (request id: 20240202110253407344738SmDnkwX1). The cause is that the official gpt-4-turbo supports at most 4096 response tokens.
1 parent 34602b2 commit ec8e251

File tree

1 file changed (+1, -1 lines changed)

docSite/content/docs/development/configuration.md

Lines changed: 1 addition & 1 deletion
@@ -224,7 +224,7 @@ llm模型全部合并
     "model": "gpt-4-0125-preview",
     "name": "gpt-4-turbo",
     "maxContext": 125000,
-    "maxResponse": 125000,
+    "maxResponse": 4000,
     "quoteMaxToken": 100000,
     "maxTemperature": 1.2,
     "inputPrice": 0,
