File input #2270

Merged: 22 commits, merged on Aug 6, 2024
1 change: 1 addition & 0 deletions .vscode/settings.json
@@ -21,6 +21,7 @@
"i18n-ally.namespace": true,
"i18n-ally.pathMatcher": "{locale}/{namespaces}.json",
"i18n-ally.extract.targetPickingStrategy": "most-similar-by-key",
"i18n-ally.translate.engines": ["deepl", "google"],
"[typescript]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
}
45 changes: 45 additions & 0 deletions docSite/content/zh-cn/docs/development/upgrading/489.md
@@ -0,0 +1,45 @@
---
title: 'V4.8.9 (in progress)'
description: 'FastGPT V4.8.9 release notes'
icon: 'upgrade'
draft: false
toc: true
weight: 816
---

## Upgrade Guide

### 1. Back up the database

### 2. Update the image

### 3. Run the initialization

From any terminal, send one HTTP request. Replace {{rootkey}} with the `rootkey` value from your environment variables, and {{host}} with the **FastGPT commercial edition domain**.

```bash
curl --location --request POST 'https://{{host}}/api/admin/init/489' \
--header 'rootkey: {{rootkey}}' \
--header 'Content-Type: application/json'
```

This initializes the multi-tenant notification settings.

-------

## V4.8.9 Release Notes

1. New - File upload configuration: whether images can be uploaded is no longer determined by the vision model but by the system configuration.
2. New - The AI chat node and tool calls can toggle "enable image recognition"; when enabled, images uploaded in the chat box and image links in the "user question" are fetched automatically.
3. New - Document parsing node.
4. Commercial edition - Team notification account binding, used to receive important messages.
5. Commercial edition - Knowledge base collection tags, allowing knowledge bases to be managed with tags.
6. Commercial edition - The knowledge base search node supports filtering by tag and by creation time.
7. New - Delete all chat input guide entries.
8. Optimization - Lazy loading of chat box messages to reduce network transfer.
9. Fix - When uploading files to a knowledge base, progress could not reach 100% on an unstable network or with many files.
10. Fix - After deleting an app, returning to chat where the last conversation used that deleted app showed an "app not found" error.
11. Fix - Default values configured for plugin dynamic variables were not displayed correctly.
12. Fix - Temperature and max response tokens did not take effect for tool calls.
13. Fix - In function-call mode, GPT models require the content parameter in the assistant role. (This does not affect most models; ToolChoice mode is now used almost everywhere and FC mode has been deprecated.)
12 changes: 10 additions & 2 deletions packages/global/common/file/constants.ts
@@ -1,11 +1,19 @@
import { i18nT } from '../../../web/i18n/utils';

/* mongo fs bucket */
export enum BucketNameEnum {
dataset = 'dataset'
dataset = 'dataset',
chat = 'chat'
}
export const bucketNameMap = {
[BucketNameEnum.dataset]: {
label: 'file.bucket.dataset'
label: i18nT('file:bucket_file')
},
[BucketNameEnum.chat]: {
label: i18nT('file:bucket_chat')
}
};

export const ReadFileBaseUrl = '/api/common/file/read';

export const documentFileType = '.txt, .docx, .csv, .xlsx, .pdf, .md, .html, .pptx';
1 change: 1 addition & 0 deletions packages/global/common/file/type.d.ts
@@ -5,4 +5,5 @@ export type FileTokenQuery = {
teamId: string;
tmbId: string;
fileId: string;
expiredTime?: number;
};
7 changes: 7 additions & 0 deletions packages/global/common/string/tools.ts
@@ -91,3 +91,10 @@ export const sliceJsonStr = (str: string) => {

return jsonStr;
};

export const sliceStrStartEnd = (str: string, start: number, end: number) => {
const overSize = str.length > start + end;
const startContent = str.slice(0, start);
const endContent = overSize ? str.slice(-end) : '';
return startContent + (overSize ? ` ...... ` : '') + endContent;
};
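
A quick, illustrative sketch of how the new `sliceStrStartEnd` helper behaves; the import path is assumed from the repo's `@fastgpt/global` alias and the sample strings are made up:

```ts
import { sliceStrStartEnd } from '@fastgpt/global/common/string/tools'; // assumed alias path

// Long strings keep the first `start` and last `end` characters with a " ...... " marker between them.
sliceStrStartEnd('abcdefghijklmnopqrstuvwxyz', 5, 5); // 'abcde ...... vwxyz'

// Strings no longer than start + end keep only the leading slice and get no marker.
sliceStrStartEnd('abcdefgh', 5, 5); // 'abcde'
```

`getHistoryPreview` further down in this PR relies on this helper to cap each history entry at 80 leading and 80 trailing characters.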
7 changes: 7 additions & 0 deletions packages/global/core/ai/prompt/AIChat.ts
@@ -119,3 +119,10 @@ export const Prompt_QuotePromptList: PromptTemplateItem[] = [
问题:"""{{question}}"""`
}
];

// Document quote prompt
export const Prompt_DocumentQuote = `将 <Quote></Quote> 中的内容作为你的知识:
<Quote>
{{quote}}
</Quote>
`;
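
As a rough sketch of how this template might be used, the `fillTemplate` helper below is a stand-in for whatever `{{variable}}` replacement utility the codebase actually calls; only the template itself comes from the diff:

```ts
import { Prompt_DocumentQuote } from '@fastgpt/global/core/ai/prompt/AIChat'; // assumed alias path

// Naive {{variable}} substitution, for illustration only.
const fillTemplate = (template: string, vars: Record<string, string>) =>
  template.replace(/\{\{(\w+)\}\}/g, (_, key: string) => vars[key] ?? '');

// Wraps the parsed document text in <Quote></Quote> so the model treats it as reference knowledge.
const documentQuotePrompt = fillTemplate(Prompt_DocumentQuote, {
  quote: 'Q3 revenue grew 12% year over year ...'
});
```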
40 changes: 33 additions & 7 deletions packages/global/core/ai/type.d.ts
@@ -2,23 +2,46 @@ import openai from 'openai';
import type {
ChatCompletionMessageToolCall,
ChatCompletionChunk,
ChatCompletionMessageParam,
ChatCompletionMessageParam as SdkChatCompletionMessageParam,
ChatCompletionToolMessageParam,
ChatCompletionAssistantMessageParam
ChatCompletionAssistantMessageParam,
ChatCompletionContentPart as SdkChatCompletionContentPart,
ChatCompletionUserMessageParam as SdkChatCompletionUserMessageParam
} from 'openai/resources';
import { ChatMessageTypeEnum } from './constants';

export * from 'openai/resources';

export type ChatCompletionMessageParam = ChatCompletionMessageParam & {
// Extension of ChatCompletionMessageParam, Add file url type
export type ChatCompletionContentPartFile = {
type: 'file_url';
name: string;
url: string;
};
// Rewrite ChatCompletionContentPart, Add file type
export type ChatCompletionContentPart =
| SdkChatCompletionContentPart
| ChatCompletionContentPartFile;
type CustomChatCompletionUserMessageParam = {
content: string | Array<ChatCompletionContentPart>;
role: 'user';
name?: string;
};

export type ChatCompletionMessageParam = (
| Exclude<SdkChatCompletionMessageParam, SdkChatCompletionUserMessageParam>
| CustomChatCompletionUserMessageParam
) & {
dataId?: string;
};
export type SdkChatCompletionMessageParam = SdkChatCompletionMessageParam;

/* ToolChoice and functionCall extension */
export type ChatCompletionToolMessageParam = ChatCompletionToolMessageParam & { name: string };
export type ChatCompletionAssistantToolParam = {
role: 'assistant';
tool_calls: ChatCompletionMessageToolCall[];
};

export type ChatCompletionMessageToolCall = ChatCompletionMessageToolCall & {
toolName?: string;
toolAvatar?: string;
@@ -28,13 +51,16 @@ export type ChatCompletionMessageFunctionCall = ChatCompletionAssistantMessagePa
toolName?: string;
toolAvatar?: string;
};

// Stream response
export type StreamChatType = Stream<ChatCompletionChunk>;

export default openai;
export * from 'openai';

// Other
export type PromptTemplateItem = {
title: string;
desc: string;
value: string;
};

export default openai;
export * from 'openai';
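
To illustrate the widened message type, here is what a user message carrying the new `file_url` content part could look like; the import path, URLs, and `dataId` value are assumptions, while the field names come from the types above:

```ts
import type { ChatCompletionMessageParam } from '@fastgpt/global/core/ai/type'; // assumed alias path

const userMessage: ChatCompletionMessageParam = {
  role: 'user',
  dataId: 'example-id', // optional id added by the extension above
  content: [
    { type: 'text', text: 'Summarize the attached report and the chart.' },
    { type: 'image_url', image_url: { url: 'https://example.com/chart.png' } },
    // The new custom part: plain documents travel as a name + URL pair.
    { type: 'file_url', name: 'report.pdf', url: 'https://example.com/report.pdf' }
  ]
};
```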
8 changes: 7 additions & 1 deletion packages/global/core/app/constants.ts
@@ -1,4 +1,4 @@
import { AppTTSConfigType, AppWhisperConfigType } from './type';
import { AppTTSConfigType, AppFileSelectConfigType, AppWhisperConfigType } from './type';

export enum AppTypeEnum {
folder = 'folder',
@@ -23,3 +23,9 @@ export const defaultChatInputGuideConfig = {
textList: [],
customUrl: ''
};

export const defaultAppSelectFileConfig: AppFileSelectConfigType = {
canSelectFile: false,
canSelectImg: false,
maxFiles: 10
};
10 changes: 9 additions & 1 deletion packages/global/core/app/type.d.ts
@@ -1,7 +1,7 @@
import type { FlowNodeTemplateType, StoreNodeItemType } from '../workflow/type/node';
import { AppTypeEnum } from './constants';
import { PermissionTypeEnum } from '../../support/permission/constant';
import { VariableInputEnum } from '../workflow/constants';
import { NodeInputKeyEnum, VariableInputEnum } from '../workflow/constants';
import { SelectedDatasetType } from '../workflow/api';
import { DatasetSearchModeEnum } from '../dataset/constants';
import { TeamTagSchema as TeamTagsSchemaType } from '@fastgpt/global/support/user/team/type.d';
@@ -91,13 +91,15 @@ export type AppChatConfigType = {
whisperConfig?: AppWhisperConfigType;
scheduledTriggerConfig?: AppScheduledTriggerConfigType;
chatInputGuide?: ChatInputGuideConfigType;
fileSelectConfig?: AppFileSelectConfigType;
};
export type SettingAIDataType = {
model: string;
temperature: number;
maxToken: number;
isResponseAnswerText?: boolean;
maxHistories?: number;
[NodeInputKeyEnum.aiChatVision]?: boolean; // Is open vision mode
};

// variable
@@ -134,3 +136,9 @@ export type AppScheduledTriggerConfigType = {
timezone: string;
defaultPrompt: string;
};
// File
export type AppFileSelectConfigType = {
canSelectFile: boolean;
canSelectImg: boolean;
maxFiles: number;
};
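
A minimal sketch of how an app could opt into document upload with the new config, assuming the `@fastgpt/global` alias paths; only `defaultAppSelectFileConfig` and the field names come from this diff:

```ts
import type { AppFileSelectConfigType } from '@fastgpt/global/core/app/type'; // assumed alias path
import { defaultAppSelectFileConfig } from '@fastgpt/global/core/app/constants'; // assumed alias path

// Allow document upload only, keeping the default cap of 10 files per message.
const fileSelectConfig: AppFileSelectConfigType = {
  ...defaultAppSelectFileConfig, // { canSelectFile: false, canSelectImg: false, maxFiles: 10 }
  canSelectFile: true
};

// This object would then be stored on the app's chatConfig.fileSelectConfig field.
```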
33 changes: 24 additions & 9 deletions packages/global/core/chat/adapt.ts
@@ -56,16 +56,21 @@ export const chats2GPTMessages = ({
text: item.text?.content || ''
};
}
if (
item.type === ChatItemValueTypeEnum.file &&
item.file?.type === ChatFileTypeEnum.image
) {
return {
type: 'image_url',
image_url: {
if (item.type === ChatItemValueTypeEnum.file) {
if (item.file?.type === ChatFileTypeEnum.image) {
return {
type: 'image_url',
image_url: {
url: item.file?.url || ''
}
};
} else if (item.file?.type === ChatFileTypeEnum.file) {
return {
type: 'file_url',
name: item.file?.name || '',
url: item.file?.url || ''
}
};
};
}
}
})
.filter(Boolean) as ChatCompletionContentPart[];
@@ -175,6 +180,16 @@
url: item.image_url.url
}
});
} else if (item.type === 'file_url') {
value.push({
// @ts-ignore
type: ChatItemValueTypeEnum.file,
file: {
type: ChatFileTypeEnum.file,
name: item.name,
url: item.url
}
});
}
});
}
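
For reference, a trimmed-down sketch of the two shapes this adapter converts between; the enum members and field names are taken from the diff, while the import path and file values are assumptions:

```ts
import { ChatItemValueTypeEnum, ChatFileTypeEnum } from '@fastgpt/global/core/chat/constants'; // assumed path

// Chat-item side: a user-uploaded document as stored on the chat record.
const chatFileValue = {
  type: ChatItemValueTypeEnum.file,
  file: {
    type: ChatFileTypeEnum.file,
    name: 'report.pdf',
    url: 'https://example.com/report.pdf'
  }
};

// GPT-message side: the same document rendered as the custom file_url content part.
const gptFilePart = {
  type: 'file_url' as const,
  name: 'report.pdf',
  url: 'https://example.com/report.pdf'
};
```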
1 change: 1 addition & 0 deletions packages/global/core/chat/type.d.ts
@@ -117,6 +117,7 @@ export type ChatItemType = (UserChatItemType | SystemChatItemType | AIChatItemTy
dataId?: string;
} & ResponseTagItemType;

// Frontend type
export type ChatSiteItemType = (UserChatItemType | SystemChatItemType | AIChatItemType) & {
dataId: string;
status: `${ChatStatusEnum}`;
56 changes: 36 additions & 20 deletions packages/global/core/chat/utils.ts
@@ -2,6 +2,7 @@ import { DispatchNodeResponseType } from '../workflow/runtime/type';
import { FlowNodeTypeEnum } from '../workflow/node/constant';
import { ChatItemValueTypeEnum, ChatRoleEnum } from './constants';
import { ChatHistoryItemResType, ChatItemType, UserChatItemValueItemType } from './type.d';
import { sliceStrStartEnd } from '../../common/string/tools';

// Concat 2 -> 1, and sort by role
export const concatHistories = (histories1: ChatItemType[], histories2: ChatItemType[]) => {
@@ -25,37 +26,52 @@ export const getChatTitleFromChatMessage = (message?: ChatItemType, defaultValue
return defaultValue;
};

// Keep the first n and last n characters
export const getHistoryPreview = (
completeMessages: ChatItemType[]
): {
obj: `${ChatRoleEnum}`;
value: string;
}[] => {
return completeMessages.map((item, i) => {
if (item.obj === ChatRoleEnum.System || i >= completeMessages.length - 2) {
return {
obj: item.obj,
value: item.value?.[0]?.text?.content || ''
};
}
const n = item.obj === ChatRoleEnum.System || i >= completeMessages.length - 2 ? 80 : 40;

const content = item.value
.map((item) => {
if (item.text?.content) {
const content =
item.text.content.length > 20
? `${item.text.content.slice(0, 20)}...`
: item.text.content;
return content;
}
return '';
})
.filter(Boolean)
.join('\n');
// Get message text content
const rawText = (() => {
if (item.obj === ChatRoleEnum.System) {
return item.value?.map((item) => item.text?.content).join('') || '';
} else if (item.obj === ChatRoleEnum.Human) {
return (
item.value
?.map((item) => {
if (item?.text?.content) return item?.text?.content;
if (item.file?.type === 'image') return 'Input an image';
return '';
})
.filter(Boolean)
.join('\n') || ''
);
} else if (item.obj === ChatRoleEnum.AI) {
return (
item.value
?.map((item) => {
return (
item.text?.content || item?.tools?.map((item) => item.toolName).join(',') || ''
);
})
.join('') || ''
);
}
return '';
})();

const startContent = rawText.slice(0, n);
const endContent = rawText.length > 2 * n ? rawText.slice(-n) : '';
const content = startContent + (rawText.length > n ? ` ...... ` : '') + endContent;

return {
obj: item.obj,
value: content
value: sliceStrStartEnd(content, 80, 80)
};
});
};
10 changes: 9 additions & 1 deletion packages/global/core/workflow/constants.ts
@@ -75,6 +75,8 @@ export enum NodeInputKeyEnum {
aiChatQuoteTemplate = 'quoteTemplate',
aiChatQuotePrompt = 'quotePrompt',
aiChatDatasetQuote = 'quoteQA',
aiChatVision = 'aiChatVision',
stringQuoteText = 'stringQuoteText',

// dataset
datasetSelectList = 'datasets',
@@ -118,7 +120,10 @@ export enum NodeInputKeyEnum {

// code
code = 'code',
codeType = 'codeType' // js|py
codeType = 'codeType', // js|py

// read files
fileUrlList = 'fileUrlList'
}

export enum NodeOutputKeyEnum {
Expand All @@ -133,6 +138,9 @@ export enum NodeOutputKeyEnum {
addOutputParam = 'system_addOutputParam',
rawResponse = 'system_rawResponse',

// start
userFiles = 'userFiles',

// dataset
datasetQuoteQA = 'quoteQA',
