I want to use the Python onnxruntime package within a Flutter application to load machine learning models on-device. #5225
-
At my last look at ONNX on embedded devices/Android/iOS, you needed to compile binaries from source (i.e. they don't ship arm64-v8a binaries). I would build that wheel (C++), create the Python bindings, and then bundle it within your package. The key question from a Flet point of view is whether you can reference something like this in your requirements.txt / pyproject.toml when working on an Android build. I would guess that you can, as you can on macOS/Windows, where I use custom wheels/binaries, but I haven't done Flet -> Android myself. To answer your question, I think the most relevant link to start this journey is this:
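If Flet's Android build does pick up project dependencies, pointing at a locally built wheel might look like this hypothetical pyproject.toml fragment. The project name, wheel filename, version, and path are all illustrative and not verified against Flet's build tooling; the `name @ file://...` direct-reference syntax itself is standard PEP 508:

```toml
# Hypothetical pyproject.toml fragment -- names, version, and wheel path are illustrative.
[project]
name = "my_flet_app"
version = "0.1.0"
dependencies = [
    # A custom-built arm64-v8a wheel bundled alongside the project:
    "onnxruntime @ file:///wheels/onnxruntime-1.19.0-cp311-cp311-linux_aarch64.whl",
]
```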
-
I tested it today: https://github.com/SamYuan1990/i18n-agent-action/actions/runs/17670451604/job/50220801655?pr=104 (related to microsoft/onnxruntime#26025). I am not sure if ONNX Runtime can provide a package.
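Since it is unclear whether the onnxruntime wheel will even resolve on a given target, a minimal sketch of probing for it before using the standard `InferenceSession` API. The model path and input dict are placeholders; only the probe and the API calls shown are standard onnxruntime usage:

```python
import importlib.util


def onnxruntime_available() -> bool:
    """True if the onnxruntime wheel resolved on this platform."""
    return importlib.util.find_spec("onnxruntime") is not None


def run_model(model_path: str, inputs: dict):
    """Load an ONNX model and run one inference pass (standard onnxruntime API)."""
    # Deferred import: a missing wheel fails here, not at app startup.
    import onnxruntime as ort

    session = ort.InferenceSession(model_path)
    # Passing None requests all model outputs.
    return session.run(None, inputs)
```

In a Flet app you could branch on `onnxruntime_available()` to show a fallback UI instead of crashing on import when the platform has no wheel.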
-
Well, for my part I am attempting sherpa-onnx for STT (speech-to-text). Hope it helps: https://github.com/SamYuan1990/flet_sherpa_onnx
-
I tried to install onnxruntime, but it could not be resolved, so I wanted to get clarity either way.