[onnx_importer.py] Fix dim_value None not correctly processed and missing Float8E4M3FNUZType. #4037
base: main
```diff
@@ -119,9 +119,12 @@ def load_onnx_model(args: argparse.Namespace) -> onnx.ModelProto:

         # Model is too big for in-memory inference: do file-based shape inference
         # to a temp file.
+        # First need to save as model might have been changed (e.g. version conversion).
+        temp_raw_file = temp_dir / "raw.onnx"
         temp_inferred_file = temp_dir / "inferred.onnx"
+        onnx.save(raw_model, temp_raw_file, save_as_external_data=False)
         onnx.shape_inference.infer_shapes_path(
-            args.input_file, temp_inferred_file, data_prop=args.data_prop
+            temp_raw_file, temp_inferred_file, data_prop=args.data_prop
         )

         # Sanity check the shape-inferred model to be sure we have a good model
```

**Review comment** (on the `onnx.save` line): This section occurs precisely when the provided model is large, so saving a temp file would be expensive. I'd prefer to only do this if the model had actually been modified, so perhaps add a bool to track whether the model was changed by earlier argument handling, and only save if so. I'm also concerned about not saving external data in this case, since this is exactly when we would exceed the 2 GB protobuf limit.
**Review comment:** This is never None (protobuf default-initializes), but rather an empty TensorShapeProto, which corresponds to a valid shape.