
Are there any limitations to running more than one model at a time? #39

Open
lxbhahaha opened this issue Jan 17, 2025 · 3 comments

@lxbhahaha

When I try to create two new models at the same time (for example, two yolox), the inference of the first model is lost, and the result of the second model is normal. Can I only run one model at a time? Or did I do something wrong?

@asus4
Owner

asus4 commented Jan 17, 2025

I tried loading two models in the NanoSAM example without any problem.

@lxbhahaha Can you make a minimal reproducible project? I would like to know your use case in code.

@lxbhahaha
Author

> I tried loading two models in the NanoSAM example without any problem.
>
> @lxbhahaha Can you make a minimal reproducible project? I would like to know your use case in code.

I uploaded the problematic part of the project to TestOnnx_250117.

Here are lines 30 to 37 of Assets/Scripts/DemoTest.cs:

30    // yolo1 = new DemoYoloxHumanDetect();
31    // hrnet1 = new DemoHRNet();
32
33    ImageAcquisiton.Instance.InitAndPlay();
34    demoYolo = new DemoYoloxHumanDetect();
35
36    // yolo2 = new DemoYoloxHumanDetect();
37    // hrnet2 = new DemoHRNet();

It currently works fine, but if you uncomment line 36 or 37, demoYolo no longer gets results. If you only uncomment line 30 or 31, demoYolo still gets results.

It seems that only the most recently created model gets results, and I can't figure out why this happens.

lxbhahaha reopened this Jan 18, 2025
@asus4
Owner

asus4 commented Jan 22, 2025

@lxbhahaha, thanks for providing the repro project. That is weird. We may need to test with the original C# library to isolate whether this is a Unity-specific issue or an issue in the original OnnxRuntime.
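
For what it's worth, a minimal isolation test against the plain Microsoft.ML.OnnxRuntime package could look like the sketch below. It is not taken from the repro project: the model file names are placeholders and the dummy input assumes a float32 input tensor, so both would need to be adjusted to the actual YOLOX models. The only point is to check whether creating a second InferenceSession makes the first one stop returning outputs outside Unity.

// Minimal two-session isolation test (sketch, not from the repro project).
// Assumes the Microsoft.ML.OnnxRuntime NuGet package and float32 model inputs;
// "yolox_a.onnx" and "yolox_b.onnx" are placeholder paths.
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

class TwoSessionTest
{
    static void Main()
    {
        // Two independent sessions, mirroring the two-model setup in DemoTest.cs.
        using var sessionA = new InferenceSession("yolox_a.onnx");
        using var sessionB = new InferenceSession("yolox_b.onnx");

        // If the Unity-side problem also reproduces here, the third call should
        // fail or return nothing; otherwise the issue is likely on the Unity side.
        RunDummy(sessionA, "A");
        RunDummy(sessionB, "B");
        RunDummy(sessionA, "A again");
    }

    static void RunDummy(InferenceSession session, string label)
    {
        var inputMeta = session.InputMetadata.First();
        // Replace dynamic dimensions (-1) with 1 for the dummy input tensor.
        var dims = inputMeta.Value.Dimensions.Select(d => d < 0 ? 1 : d).ToArray();
        var tensor = new DenseTensor<float>(dims);
        var inputs = new List<NamedOnnxValue>
        {
            NamedOnnxValue.CreateFromTensor(inputMeta.Key, tensor)
        };
        using var results = session.Run(inputs);
        Console.WriteLine($"{label}: {results.Count()} output(s)");
    }
}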
