
API: add tokenize and tokenize/count routes #7

Closed
celsowm opened this issue Mar 11, 2025 · 0 comments

Comments


celsowm commented Mar 11, 2025

Hi!
I used to be a llama-cpp-python[server] user, but now I am trying to migrate to SGLang!

I know the OpenAI v1 API standard is very limited, so each inference server extends its routes to make its users' lives easier.

So, as a new user, I would like to suggest two new routes for the SGLang server:

tokenize: simple and direct, returning the array of tokens.

tokenize/count: optional but useful, returning the total number of tokens (if not, a simple len() over the tokenize response is enough on the client side; see the sketch below).
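
Something along these lines is what I have in mind, as a minimal FastAPI sketch. The route names, request/response fields, and the gpt2 tokenizer are just placeholders I picked for illustration, not SGLang's actual internals:

```python
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import AutoTokenizer

app = FastAPI()
# Any Hugging Face tokenizer works here; gpt2 is just a placeholder.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

class TokenizeRequest(BaseModel):
    text: str

@app.post("/tokenize")
def tokenize(req: TokenizeRequest):
    # Return the array of token IDs for the given text.
    return {"tokens": tokenizer.encode(req.text)}

@app.post("/tokenize/count")
def tokenize_count(req: TokenizeRequest):
    # Return only the number of tokens.
    return {"count": len(tokenizer.encode(req.text))}
```

If only the first route existed, the client could still get the count by taking len() of the returned tokens array.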

Thanks in advance
Celso

celsowm closed this as completed Mar 11, 2025