Commit 1564e23

Support Llama-3.3
1 parent c2a3f05 commit 1564e23

2 files changed: +64, -72 lines


README.md (+61, -70)

````diff
@@ -1,26 +1,10 @@
 # Chat UI
 
-<p align="left">
-  <a href="https://vuejs.org/">
-    <img src="https://img.shields.io/badge/Vue3-brightgreen.svg" alt="vue">
-  </a>
-  &nbsp
-  <a href="https://vuetifyjs.com/">
-    <img src="https://img.shields.io/badge/Vuetify-blue.svg" alt="element-ui">
-  </a>
-  &nbsp
-  <a>
-    <img src="https://img.shields.io/badge/HTML-red.svg">
-  </a>
-  &nbsp
-  <a href="https://hub.docker.com/repository/docker/aiql/chat-ui/tags?page=1&ordering=last_updated">
-    <img src="https://img.shields.io/badge/Docker-lightskyblue.svg">
-  </a>
-  &nbsp
-  <a href="https://github.com/AI-QL/chat-ui/blob/main/LICENSE">
-    <img src="https://img.shields.io/github/license/AI-QL/chat-ui" alt="license">
-  </a>
-</p>
+[![](https://img.shields.io/badge/Vue3-brightgreen.svg)](https://vuejs.org)
+[![](https://img.shields.io/badge/Vuetify-blue.svg)](https://vuetifyjs.com)
+![](https://img.shields.io/badge/HTML-red.svg)
+[![Docker Pulls](https://img.shields.io/docker/pulls/aiql/chat-ui.svg)](https://hub.docker.com/repository/docker/aiql/chat-ui/tags?page=1&ordering=last_updated)
+[![LICENSE](https://img.shields.io/github/license/AI-QL/chat-ui)](https://github.com/AI-QL/chat-ui/blob/main/LICENSE)
 
 The UI of Chat is becoming increasingly complex, often encompassing an entire front-end project along with deployment solutions.
 
@@ -48,8 +32,12 @@ By simplifying the structure and key functions, developers can quickly set up an
 
 ## How to use
 
-#### Option 1: Goto demo [AIQL](https://chat.aiql.com/)
-> The demo will use `Llama-3.2` by default, image upload is only supported for vision models
+#### Option 1: Chat with demo [AIQL](https://chat.aiql.com/)
+> The demo will use `Llama-3.3-70B-Instruct` by default
+
+> Multimodal image upload is only supported for vision models
+
+> MCP tool calls require a desktop backend and LLM support in OpenAI format; see [Chat-MCP](https://github.com/AI-QL/chat-mcp)
 
 #### Option 2: Download [Index](./index.html) and open it locally (recommended)
 
@@ -61,7 +49,7 @@ python3 -m http.server 8000
 > Then, open your browser and access `http://localhost:8000`
 
 #### Option 4: fork this repo and link it to [Cloudflare pages](https://developers.cloudflare.com/pages)
-> Demo https://www2.aiql.com
+> Demo: https://www2.aiql.com
 
 #### Option 5: Deploy your own Chatbot by [Docker](https://hub.docker.com/repository/docker/aiql/chat-ui/tags?page=1&ordering=last_updated)
 ```shell
@@ -104,57 +92,60 @@ If you're experiencing issues opening the page and a simple refresh isn't resolv
 ## K8s
 
 1. Introduce the image as sidecar container
-```yaml
-spec:
-  template:
-    metadata:
-      labels:
-        app: my-app
+
+    ```yaml
     spec:
-      containers:
-      - name: chat-ui
-        image: aiql/chat-ui
-        ports:
-        - containerPort: 8080
-```
+      template:
+        metadata:
+          labels:
+            app: my-app
+        spec:
+          containers:
+          - name: chat-ui
+            image: aiql/chat-ui
+            ports:
+            - containerPort: 8080
+    ```
 
 2. Add service
-```yaml
-apiVersion: v1
-kind: Service
-metadata:
-  name: chat-ui-service
-spec:
-  selector:
-    app: my-app
-  ports:
-    - protocol: TCP
-      port: 8080
-      targetPort: 8080
-  type: LoadBalancer
-```
+
+    ```yaml
+    apiVersion: v1
+    kind: Service
+    metadata:
+      name: chat-ui-service
+    spec:
+      selector:
+        app: my-app
+      ports:
+        - protocol: TCP
+          port: 8080
+          targetPort: 8080
+      type: LoadBalancer
+    ```
 
 3. You can access the port or add other ingress
-```yaml
-apiVersion: networking.k8s.io/v1
-kind: Ingress
-metadata:
-  name: my-app-ingress
-  annotations:
-    nginx.ingress.kubernetes.io/rewrite-target: /$1
-spec:
-  rules:
-  - host: chat-ui.example.com
-    http:
-      paths:
-      - path: /
-        pathType: Prefix
-        backend:
-          service:
-            name: chat-ui-service
-            port:
-              number: 8080
-```
+
+    ```yaml
+    apiVersion: networking.k8s.io/v1
+    kind: Ingress
+    metadata:
+      name: my-app-ingress
+      annotations:
+        nginx.ingress.kubernetes.io/rewrite-target: /$1
+    spec:
+      rules:
+      - host: chat-ui.example.com
+        http:
+          paths:
+          - path: /
+            pathType: Prefix
+            backend:
+              service:
+                name: chat-ui-service
+                port:
+                  number: 8080
+    ```
 
 ## Demo
 ![](./demo.gif)
````
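For reference, one way to exercise the K8s manifests shown above is sketched below; the file names are placeholders, and it assumes `kubectl` access to the target cluster:

```shell
# Apply the Deployment snippet, Service, and Ingress (illustrative file names)
kubectl apply -f deployment.yaml -f service.yaml -f ingress.yaml

# Check whether the LoadBalancer Service received an external address
kubectl get svc chat-ui-service

# Without a LoadBalancer or Ingress, port-forward for a quick local test
kubectl port-forward svc/chat-ui-service 8080:8080
# then open http://localhost:8080
```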

index.html (+3, -2)

````diff
@@ -1570,11 +1570,12 @@ <h5 class="font-weight-bold">{{ column.key }}</h5>
         "/v1/openai/chat/completions",
         "/openai/v1/chat/completions"],
       model: [
+        "meta-llama/Llama-3.3-70B-Instruct",
         "Qwen/Qwen2.5-72B-Instruct",
         "meta-llama/Llama-3.2-11B-Vision-Instruct",
         "meta-llama/Llama-3.2-90B-Vision-Instruct",
-        "meta-llama/Meta-Llama-3.1-70B-Instruct",
         "meta-llama/Meta-Llama-3.1-8B-Instruct",
+        "meta-llama/Meta-Llama-3.1-70B-Instruct",
         "mistralai/Mistral-7B-Instruct-v0.3",
         "mistralai/Mistral-Nemo-Instruct-2407",
       ],
@@ -1604,7 +1605,7 @@ <h5 class="font-weight-bold">{{ column.key }}</h5>
       apiKey: "",
       url: "https://api2.aiql.com",
       path: "/chat/completions",
-      model: "meta-llama/Llama-3.2-90B-Vision-Instruct",
+      model: "meta-llama/Llama-3.3-70B-Instruct",
       authPrefix: "Bearer",
       contentType: "application/json",
       max_tokens_type: "max_tokens",
````
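For reference, the updated defaults correspond to an OpenAI-compatible chat completions request roughly like the sketch below; `$API_KEY` is a placeholder and the request body is illustrative:

```shell
# Assumed endpoint: the url + path from the config above
curl https://api2.aiql.com/chat/completions \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama/Llama-3.3-70B-Instruct",
        "messages": [{"role": "user", "content": "Hello"}],
        "max_tokens": 256
      }'
```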

0 commit comments
