Bump quickjs to 2024 #133
Conversation
Hello, I am a code review bot on flows.network. Here are my reviews of code commits in this PR.

**Overall Summary:**

First, there are dependency updates, including a changed checksum for the base64 crate. This may introduce compatibility issues or bugs, so it is crucial to review the changes in the base64 crate to ensure it won't cause any problems. The updates to several other dependencies may also introduce compatibility issues or breaking changes, so it is important to review the changelogs or release notes for each dependency.

Next, there are changes related to the development environment setup, including a file removal. There are also newly added files and directories, modifications to specific code files, and a fix for an existing issue.

In summary, this pull request introduces several important changes and potential issues that need to be addressed. It is crucial to review the base64 crate change, the other dependency updates, the development environment setup, and the specific code modifications.

**Details**

Commit d8e70af5bc103e448da01d4608d14332a4183a62. The key changes in this patch are:

Potential problems to note:

Commit 7ca0db86c34c09304243836ebaaeee1f8b9d91d8. Key changes in the patch:

Potential problems:

These are the most important findings and potential issues in the patch.

Commit 6d0be6d33d978ca83d037754a4e45a9380c9b082. The key changes in this patch include:

There don't appear to be any potential problems with these changes.

Commit cb444fef94d90737619b293f89427776d3d0a096. Key changes:

Potential problems:

Overall, the change appears to be a fix.

Commit c39f8e3b6503e7d1c0e76f534fcf5aa826265641. Key Changes:

Potential Problems:

Overall, it is important to address the potential problems and clarify the intentions behind the changes made in this patch.

Commit 17128409cda50459b4559f559c98a736233dfb25. Key changes:

Potential problems:

Overall, the patch seems to address a specific issue and makes a minor change.
Can you add a CI example with TinyLlama? Thanks.
https://huggingface.co/second-state/TinyLlama-1.1B-Chat-v1.0-GGUF
@@ -0,0 +1,57 @@
import { GGMLChatCompletionRequest, GGMLChatPrompt } from '_wasi_nn_ggml_template'
Is this package private or internal? Why do we have _ in the package name?
"repeat-penalty": 1.1
}

let graph = build_graph_from_cache(3, JSON.stringify(opt), "default")
What does the first argument `3` mean?
let req = new GGMLChatCompletionRequest()

let messages = ['hello', 'who are you?']
I would like to show an interactive example here -- like llama-chat in LlamaEdge.
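One way to get an llama-chat-style interactive loop is to factor the REPL over I/O callbacks, so the terminal wiring stays separate from the model calls. This is a sketch, not part of the PR: in WasmEdge-quickjs, `readLine` could be `std.in.getline()` from the QuickJS `std` module, and `reply` would wrap the GGML compute calls shown in this diff; both wirings are assumptions.

```javascript
// Sketch of an interactive chat loop. readLine returns null on EOF;
// reply maps a user message to the model's answer; print displays a line.
function chatLoop(readLine, reply, print) {
  while (true) {
    let line = readLine()
    if (line === null || line === 'exit') break // EOF or explicit quit
    print('[YOU]: ' + line)
    print('[BOT]: ' + reply(line))
  }
}
```

With mock callbacks this runs anywhere; in the runtime, `readLine` would block on stdin the way llama-chat does.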
while (1) {
  try {
    context.compute_single()
Can we use streaming responses? In fact, we could have two examples. One uses the simple response and the other uses streaming responses.
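A streaming variant could hand each generated token to a callback as soon as `compute_single()` produces it, instead of waiting for the full response. This is a sketch only: the `ctx` methods mirror the `compute_single()` call in this diff, but `get_output_single(0)` returning the newest token is an assumption about the wasi-nn binding, not a documented API.

```javascript
// Sketch: stream tokens one at a time via a callback.
function streamResponse(ctx, onToken) {
  let full = ''
  while (true) {
    try {
      ctx.compute_single()        // advance generation by one token
    } catch (e) {
      break                       // binding signals end-of-sequence
    }
    let token = ctx.get_output_single(0)
    full += token
    onToken(token)                // e.g. print() for a live display
  }
  return full
}
```

The simple-response example would then just be `streamResponse(ctx, () => {})` followed by printing the return value.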
print("[YOU]:", messages[i])
req.push_message("user", messages[i])
let p = template.build(req)
context.set_input(0, p, [1], 3)
What do those numbers in the call parameters mean?
let template = new GGMLChatPrompt('llama-2-chat')

let req = new GGMLChatCompletionRequest()
Can I set an optional system prompt here?
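If `push_message` accepts a `'system'` role, a system prompt could be pushed before any user turns. That role is an assumption to illustrate the question, not confirmed behavior of the binding; the stand-in class below exists only so the sketch runs anywhere (the real `GGMLChatCompletionRequest` comes from `_wasi_nn_ggml_template`).

```javascript
// Stand-in for the real class from '_wasi_nn_ggml_template'.
class GGMLChatCompletionRequest {
  constructor() { this.messages = [] }
  push_message(role, text) { this.messages.push({ role, text }) }
}

let req = new GGMLChatCompletionRequest()
// Hypothetical optional system prompt, set before the user messages:
req.push_message('system', 'You are a concise, helpful assistant.')
req.push_message('user', 'hello')
```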
for (var i in messages) {
  print("[YOU]:", messages[i])
  req.push_message("user", messages[i])
I think `user` here is a special string? Maybe we should turn it into a `const` to avoid mis-spellings etc.
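Naming the role strings once would catch silent typos like `'usr'`. A minimal sketch, assuming the binding accepts exactly the role strings used in this diff:

```javascript
// Role strings the ggml chat binding appears to expect (per this diff).
const ROLE_USER = 'user'
const ROLE_ASSISTANT = 'assistant'

// Inside the existing loop this would become:
//   req.push_message(ROLE_USER, messages[i])
//   req.push_message(ROLE_ASSISTANT, ss)
```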
}
}
}
req.push_message("assistant", ss)
Same as the `user` comment above, but for `assistant`.
We decided to postpone the GGML examples to a later PR.