I added my Hugging Face read API key, but when I try to run a query I get a 503.
INFO:server.lib.inference:Requesting inference from databricks/dolly-v2-12b on huggingface
INFO:werkzeug:127.0.0.1 - - [24/Apr/2023 23:37:51] "POST /api/inference/text/stream HTTP/1.1" 200 -
ERROR:server.lib.inference:Error: Request failed: 503 Service Unavailable
How do I fix this?
This looks like it's coming from the Hugging Face Inference API: you'll have to wait a bit for the model to "warm up" before you can call it for inference. See https://huggingface.co/docs/api-inference/faq#:~:text=Rate%20limits&text=We%20try%20to%20balance%20the,errors%20saying%20models%20are%20loading.
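One way to handle the warm-up period is to retry on 503 with a backoff, since the Inference API returns 503 while the model is still loading. Below is a minimal sketch; the request function is injected as a parameter so the retry logic is testable, and the commented wiring at the bottom (the endpoint URL pattern, the `HF_TOKEN` placeholder, and the `options.wait_for_model` flag) is an assumption based on the public Inference API docs, not code from this project.

```python
import time


def query_with_retry(send_request, max_retries=5, backoff_seconds=2):
    """Retry an Inference API call while the model warms up.

    send_request: a zero-argument callable returning (status_code, body).
    It is injected rather than hardcoded so this retry loop can be
    exercised without a live endpoint.
    """
    status, body = None, None
    for attempt in range(max_retries):
        status, body = send_request()
        if status != 503:
            return status, body
        # 503 from the hosted Inference API usually means the model
        # is still loading; back off a little longer each attempt.
        time.sleep(backoff_seconds * (attempt + 1))
    return status, body


# Hypothetical wiring against the hosted Inference API, using the model
# name from the log above (HF_TOKEN stands in for your read token):
#
# import requests
#
# def send():
#     r = requests.post(
#         "https://api-inference.huggingface.co/models/databricks/dolly-v2-12b",
#         headers={"Authorization": "Bearer HF_TOKEN"},
#         json={"inputs": "Hello", "options": {"wait_for_model": True}},
#     )
#     return r.status_code, r.text
#
# status, body = query_with_retry(send)
```

Passing `options: {"wait_for_model": true}` in the request body asks the hosted API itself to block until the model is ready, which can replace or complement the client-side retry.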