fix(kleidiai-llm-chatbot): add command to install specific httpx vers… #1895
I added a command to roll the httpx version back to 0.27.2.
I noticed an issue while planning a workshop based on the following Learning Path: https://learn.arm.com/learning-paths/servers-and-cloud-computing/pytorch-llama/. If httpx isn't rolled back to a version prior to 0.28, the Streamlit frontend throws a "proxies" error when you try to send a message to the chatbot.
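For reference, a minimal sketch of the kind of pinning step this change adds (the exact file and placement within the Learning Path instructions are not shown here):

```bash
# Pin httpx to a release before 0.28, which dropped the deprecated
# "proxies" keyword argument and triggers the error described above.
pip install httpx==0.27.2
```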
I know this is probably a band-aid fix, but because I will be sending people to try this Learning Path after they complete the workshop, I wanted them to be able to complete all the steps successfully.
Before submitting a pull request for a new Learning Path, please review Create a Learning Path.
Please do not include any confidential information in your contribution. This includes confidential microarchitecture details and unannounced product information.
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of the Creative Commons Attribution 4.0 International License.