Add mistral-common library #1641

Open · wants to merge 2 commits into main

Conversation


juliendenize commented Jul 18, 2025

Hi!

We'd like to add vLLM as a library and track the downloads via two possible formats:

  • Transformers with the config.json
  • Mistral with the params.json

The suggested code snippet is for serving.

Edit:
Based on @Wauplin's comments, we changed the integration to add the mistral-common library instead.


Wauplin commented Jul 18, 2025

Hi @juliendenize, thanks for the PR. In the ecosystem we consider vLLM a Local App rather than a library. If you go to https://huggingface.co/mistralai/Mistral-Small-24B-Base-2501?local-app=vllm for instance, you'll see how to serve the model using vLLM. If you are interested in counting the downloads of model repos based on mistral-common, I would suggest registering mistral-common instead. The PR would look like this:

// (to add in alphabetical order)
	"mistral-common": {
		prettyLabel: "mistral-common",
		repoName: "mistral-common",
		repoUrl: "https://github.com/mistralai/mistral-common",
		docsUrl: "https://mistralai.github.io/mistral-common/",
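		// count a download when either the Transformers config.json or the Mistral params.json is fetched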
		countDownloads: `path:"config.json" OR path:"params.json"`,
	},

(I haven't added a code snippet as I'm unsure how to use mistral-common.)
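
For illustration, a hypothetical snippet function in the style of the existing library snippets could look like the sketch below. The embedded Python follows the mistral-common README, and from_hf_hub is an assumption about its loader API, so treat this as a starting point rather than a final snippet:

// Hypothetical sketch for model-libraries-snippets.ts; the names here are assumptions, not merged code.
// ModelData is redeclared minimally so the sketch is self-contained.
interface ModelData {
	id: string;
	tags: string[];
}

export const mistral_common = (model: ModelData): string[] => [
	`from mistral_common.protocol.instruct.messages import UserMessage
from mistral_common.protocol.instruct.request import ChatCompletionRequest
from mistral_common.tokens.tokenizers.mistral import MistralTokenizer

# Load the tokenizer from the Hub repo (from_hf_hub is assumed; adjust to the actual loader)
tokenizer = MistralTokenizer.from_hf_hub("${model.id}")

# Tokenize a chat completion request, as shown in the mistral-common README
tokenized = tokenizer.encode_chat_completion(
    ChatCompletionRequest(messages=[UserMessage(content="Hello!")])
)
print(tokenized.tokens)`,
];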

You'll also have to add library_name: mistral-common to the model card metadata (the YAML block at the top of each model's README.md).


Wauplin commented Jul 18, 2025

Regarding the PR itself, we won't merge it in its current form. vLLM is a widely used app that is not specific to mistral-common (see https://huggingface.co/models?other=vllm), so we don't want the snippet to include mistral-specific instructions for all of those models.


Wauplin commented Jul 18, 2025

Btw, vllm snippets are defined here. It is possible to update the install command to include pip install -U mistral-common for models with the mistral-common tag (e.g. if model.tags.includes("mistral-common") ...), as sketched below.
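
A hypothetical sketch of that conditional (not the actual local-apps.ts code; ModelData is again a minimal stand-in):

// Pick the vLLM install command based on whether the model carries the mistral-common tag.
interface ModelData {
	id: string;
	tags: string[];
}

const vllmInstallCommand = (model: ModelData): string =>
	model.tags.includes("mistral-common")
		? "pip install -U vllm mistral-common"
		: "pip install -U vllm";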

juliendenize changed the title from "Add vLLM library" to "Add mistral-common library" on Jul 18, 2025
juliendenize (Author) commented

Hi @Wauplin, thanks a lot for the detailed feedback and for giving me the context and the pointers! Much appreciated.

"so we don't want the snippet to include mistral-specific instructions for all of them"

This was indeed a concern for me when I pushed the original PR, and also why I put the Mistral-related instructions in comments. So it's very cool that your proposed solution avoids that.

I updated the PR accordingly.

I just have a question regarding how tags work: is the library_name considered a tag? Meaning, would this be true:

model.tags.includes("mistral-common") // = true?

Or should I update the PR to check the library_name instead?
