Qwen2-7B-Instruct Langchain integration issue #225
Comments
> Since you are running it directly in the notebook file, you can just use Qwen2_LLM directly. The import approach is for when that code has been saved as a .py file and is imported from there.

Oh, I see. So there are two ways to run the Q&A: you can either call Qwen2_LLM directly in the notebook, or save it as a .py file and import it. Is my understanding correct?
> Yes.

Got it, thank you very much for clearing that up.
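A minimal sketch of the first approach described above, assuming the Qwen2_LLM class has already been defined (and the model downloaded) in the running notebook; the model path is the one used in this thread:

llm = Qwen2_LLM(mode_name_or_path="/root/autodl-tmp/qwen/Qwen2-7B-Instruct")  # no import needed: the class is defined in the notebook
print(llm("你是谁"))  # call the wrapped model directly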
In the .ipynb file, I ran the custom LLM class:
from langchain.llms.base import LLM
from typing import Any, List, Optional
from langchain.callbacks.manager import CallbackManagerForLLMRun
from transformers import AutoTokenizer, AutoModelForCausalLM, GenerationConfig, LlamaTokenizerFast
import torch

class Qwen2_LLM(LLM):
    # Custom LLM class wrapping a locally stored Qwen2 model
    tokenizer: AutoTokenizer = None
    model: AutoModelForCausalLM = None
    # ... remainder of the class (__init__ loading the model, _call, _llm_type) omitted from this excerpt
Then, when I called Qwen2_LLM, it raised an error:
from LLM import Qwen2_LLM
llm = Qwen2_LLM(mode_name_or_path = "/root/autodl-tmp/qwen/Qwen2-7B-Instruct")
print(llm("你是谁"))
It reports ModuleNotFoundError: No module named 'LLM', i.e. the module does not exist, but it should be there. How can I fix this?
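For reference, a hedged sketch of the second (import-based) approach discussed in the comments, under the assumption that the custom class cell has been saved as a file named LLM.py in the notebook's working directory; the ModuleNotFoundError above simply means no LLM.py is visible on Python's import path:

import os, sys

sys.path.append(os.getcwd())       # make sure the notebook's working directory is importable
print(os.path.exists("LLM.py"))    # should print True once the class code is saved as LLM.py

from LLM import Qwen2_LLM          # resolves only when LLM.py can be found on sys.path

llm = Qwen2_LLM(mode_name_or_path="/root/autodl-tmp/qwen/Qwen2-7B-Instruct")
print(llm("你是谁"))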