Added MAUI usage example (Android) #1217
Open
Added MAUI example (Android)
Disclaimer
I’m new to the project and I hope I have followed all the contribution guidelines and policies correctly. If not, please forgive me and kindly let me know what I should fix or improve.
Context
As suggested by @AmSmart in PR #1179, I extended the Mobile project by developing a chatbot as a basic working example app using LlamaSharp on MAUI.
Important note on functionality (ISSUE)
I noticed that the example works correctly on an Android emulator (running on a PC), but on a real Android device it crashes with the following error related to loading the `CommunityToolkit.HighPerformance.dll` dependency:

@AmSmart, could you please check what is going on here?
A simple idea from building the app
While developing the app, it occurred to me that it might be useful to provide an API like LLamaWeights.LoadFromStream to load the model directly from a stream. This could be handy when a small model is bundled with the APK. Currently, since loading requires a file path, the model must be extracted from the APK and saved to device storage, resulting in two copies: one compressed inside the APK and one extracted. With a stream-based load, the app could load the model directly from the APK without extracting it. I understand that in a real-world scenario the model probably won't be shipped with the APK, but I thought it was an interesting possibility and wanted to hear your thoughts on this.
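To make the idea concrete, here is a minimal sketch of what the current workaround looks like, and what the proposed API could replace. It assumes a placeholder asset name (`model.gguf`) bundled in the app package; `LLamaWeights.LoadFromFile` and `ModelParams` are the existing LLamaSharp API, while `LLamaWeights.LoadFromStream` is purely hypothetical and does not exist today.

```csharp
using LLama;
using LLama.Common;
using Microsoft.Maui.Storage;

public static class ModelLoader
{
    // Current approach: LLamaWeights.LoadFromFile requires a file path,
    // so the bundled model must first be extracted from the APK into
    // app storage, leaving two copies of the weights on the device.
    public static async Task<LLamaWeights> LoadBundledModelAsync()
    {
        // "model.gguf" is a placeholder asset name for this sketch.
        var targetPath = Path.Combine(FileSystem.AppDataDirectory, "model.gguf");

        if (!File.Exists(targetPath))
        {
            // Second copy: extracted from the APK onto device storage.
            using var packageStream = await FileSystem.OpenAppPackageFileAsync("model.gguf");
            using var fileStream = File.Create(targetPath);
            await packageStream.CopyToAsync(fileStream);
        }

        return LLamaWeights.LoadFromFile(new ModelParams(targetPath));
    }

    // Proposed (hypothetical) API: read the weights straight from the
    // package stream, with no extracted copy on disk. The signature is
    // only a suggestion for discussion.
    //
    // public static async Task<LLamaWeights> LoadBundledModelDirectAsync()
    // {
    //     using var packageStream = await FileSystem.OpenAppPackageFileAsync("model.gguf");
    //     return await LLamaWeights.LoadFromStream(packageStream);
    // }
}
```

The extraction step in `LoadBundledModelAsync` is what a stream-based loader would eliminate, roughly halving the on-device footprint for a bundled model.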