Some of the ONNX model zoo models contain a "QLinearAdd" op, which results in a return type of !torch.none when importing the model.
%88 = torch.operator "onnx.QLinearAdd"(%87, %43, %42, %46, %47, %48, %45, %44) : (!torch.vtensor<[1,4096],ui8>, !torch.vtensor<[],f32>, !torch.vtensor<[],ui8>, !torch.vtensor<[4096],ui8>, !torch.vtensor<[],f32>, !torch.vtensor<[],ui8>, !torch.vtensor<[],f32>, !torch.vtensor<[],ui8>) -> !torch.none
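According to the ONNX Runtime contrib operator documentation, QLinearAdd dequantizes both inputs, adds them element-wise, and requantizes the result. A minimal NumPy sketch of those semantics for the uint8 case shown in the IR above (function and variable names are illustrative, not part of any importer):

```python
import numpy as np

def qlinear_add(a, a_scale, a_zp, b, b_scale, b_zp, c_scale, c_zp):
    # Dequantize both uint8 inputs to float32: (x - zero_point) * scale.
    a_fp = (a.astype(np.float32) - np.float32(a_zp)) * np.float32(a_scale)
    b_fp = (b.astype(np.float32) - np.float32(b_zp)) * np.float32(b_scale)
    # Element-wise add in float, then requantize back to uint8.
    c_fp = a_fp + b_fp
    c = np.rint(c_fp / np.float32(c_scale)) + c_zp
    return np.clip(c, 0, 255).astype(np.uint8)
```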
Steps to reproduce:
1. Build SHARK-TestSuite by following the steps at https://github.com/nod-ai/SHARK-TestSuite/tree/main/alt_e2eshark
2. Run the following command: python run.py -va -t bvlcalexnet-12-int8
3. The imported model will be available under test-run/bvlcalexnet-12-int8/ (see the sketch after the list below for a quick way to spot the ops that failed to import)

Affected models:
- bvlcalexnet-12-int8
- squeezenet1.0-12-int8
- caffenet-12-int8
- densenet-12-int8
- vgg16-12-int8
- mobilenetv2-12-int8
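Assuming run.py leaves the imported MLIR under test-run/&lt;model&gt;/ (the directory layout and file glob below are assumptions), the ops that failed to import can be listed by scanning for torch.operator calls whose result type is !torch.none:

```python
from pathlib import Path

# Assumed layout: the SHARK-TestSuite run leaves the imported MLIR under
# test-run/<model>/; the exact file name may differ.
for mlir_file in Path("test-run/bvlcalexnet-12-int8").glob("*.mlir"):
    for line in mlir_file.read_text().splitlines():
        # Unsupported ops are kept as opaque torch.operator calls; a
        # !torch.none result type means the importer could not determine
        # the op's output type.
        if "torch.operator" in line and "!torch.none" in line:
            print(f"{mlir_file.name}: {line.strip()}")
```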
This op is not part of the standard ONNX operator set (https://onnx.ai/onnx/operators/); it is an ONNX Runtime contrib op.
Documentation: https://github.com/microsoft/onnxruntime/blob/main/docs/ContribOperators.md#com.microsoft.QLinearAdd
More information regarding this: onnx/onnx#5895
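Since QLinearAdd comes from the com.microsoft contrib domain rather than the default ONNX domain, affected models can be spotted by inspecting node domains with the onnx Python package (the model path below is an assumption):

```python
import onnx

# Hypothetical path to one of the downloaded model zoo files.
model = onnx.load("test-run/bvlcalexnet-12-int8/model.onnx")

for node in model.graph.node:
    if node.op_type == "QLinearAdd":
        # An empty domain means the default ONNX opset; contrib ops such as
        # QLinearAdd are expected to carry the com.microsoft domain.
        print(node.name, node.op_type, repr(node.domain))
```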
ONNX Runtime ticket tracking support for this op: microsoft/onnxruntime#24028