Are there any limitations to running more than one model at a time? #39
When I try to create two new models at the same time (for example, two YOLOX models), the inference results of the first model are lost, while the second model's results are normal. Can I only run one model at a time, or did I do something wrong?

Comments
I tried loading two models in the NanoSAM example without any problems. @lxbhahaha Can you make a minimal reproducible project? I'd like to see your use case in code.
I uploaded the problematic part of the project to TestOnnx_250117. The relevant code is at lines 30 to 37 of Assets/Scripts/DemoTest.cs. It currently works fine, but if you uncomment line 36 or 37, demoYolo no longer gets any results. If you only uncomment line 30 or 31, demoYolo still gets results. It seems that only the most recently created model produces results, and I can't figure out why this happens.
@lxbhahaha, thanks for providing the repro project. That is weird. We may need to test with the original C# library to isolate whether this is a Unity-specific issue or one in the upstream ONNX Runtime.
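In case it helps with that isolation test, here is a minimal sketch of what it could look like against the plain Microsoft.ML.OnnxRuntime NuGet package, outside Unity. The model file names (modelA.onnx, modelB.onnx) are placeholders, and it assumes each model's first input is a float tensor. It creates both sessions up front and then runs the first session again after the second, which is the step that breaks in the Unity repro:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

class TwoSessionTest
{
    static void Main()
    {
        // Create both sessions before running either, mirroring the Unity repro.
        // Model paths are placeholders; substitute any two ONNX models.
        using var sessionA = new InferenceSession("modelA.onnx");
        using var sessionB = new InferenceSession("modelB.onnx");

        Run(sessionA, "A");
        Run(sessionB, "B");
        // If sessions were independent, the first session should still
        // produce output here.
        Run(sessionA, "A again");
    }

    static void Run(InferenceSession session, string label)
    {
        // Build a zero-filled input matching the model's first input.
        // Assumes that input is a float tensor; dynamic dimensions (-1)
        // are replaced with 1.
        var meta = session.InputMetadata.First();
        int[] dims = meta.Value.Dimensions.Select(d => d > 0 ? d : 1).ToArray();
        var tensor = new DenseTensor<float>(dims);
        var inputs = new List<NamedOnnxValue>
        {
            NamedOnnxValue.CreateFromTensor(meta.Key, tensor)
        };

        using var results = session.Run(inputs);
        Console.WriteLine($"{label}: got output '{results.First().Name}'");
    }
}
```

If all three runs print an output name, two InferenceSession instances coexist fine in the plain C# library, and the problem would be on the Unity side.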