
Inquiry: context length during fine-tuning #2

Open
Davidgzx opened this issue Jul 16, 2024 · 0 comments

Comments

@Davidgzx

Hello, I see you've open-sourced a model based on Qwen2-7B.

[screenshot]

I noticed that you concatenate (pack) training samples. How do you control the sample lengths before and after packing, and what truncation length (context length) do you use? Qwen2 itself should have been trained with a 32768-token context.
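For reference, a minimal sketch of the packing scheme the question refers to: greedily concatenating tokenized samples into fixed-length sequences, truncating any sample that exceeds the context length. This is an illustration under assumed conventions (the `pack_samples` helper and the 32768 constant are hypothetical, not the repo's actual code):

```python
from typing import Iterable, List

CONTEXT_LENGTH = 32768  # Qwen2's published context length

def pack_samples(tokenized: Iterable[List[int]],
                 max_len: int = CONTEXT_LENGTH) -> List[List[int]]:
    """Greedily concatenate tokenized samples into sequences of at most
    max_len tokens; samples longer than max_len are truncated first."""
    packed: List[List[int]] = []
    current: List[int] = []
    for tokens in tokenized:
        tokens = tokens[:max_len]  # truncate over-long samples
        if len(current) + len(tokens) > max_len:
            # current sequence is full; start a new one
            packed.append(current)
            current = []
        current.extend(tokens)
    if current:
        packed.append(current)
    return packed
```

Whether the repo packs to the full 32768 tokens or to a shorter fine-tuning window is exactly what the question is asking.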
