
trt conversion #110

Open
SebastianJanampa opened this issue Dec 12, 2024 · 3 comments
Comments

@SebastianJanampa

Hello,

Thanks for the incredible work you've done. I tried to convert the model to TensorRT FP16, but I hit a segmentation fault. If I remove `--fp16`, the conversion works (producing an FP32 engine instead).

I just wanted to see how much faster the model becomes with TensorRT (I'm new to this, so I was curious), but then I ran into another issue:

```
[12/11/2024-22:29:23] [TRT] [E] IRuntime::deserializeCudaEngine: Error Code 1: Serialization (Serialization assertion safeVersionRead == kSAFE_SERIALIZATION_VERSION failed. Version tag does not match. Note: Current Version: 0, Serialized Engine Version: 239)

  File "/home/sebastian/D-FINE/tools/benchmark/trt_benchmark.py", line 44, in __init__
    self.context = self.engine.create_execution_context()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'create_execution_context'
```

Could anyone help me, please?

@GiovanniFyc
Contributor

Change the TensorRT version.

@SebastianJanampa
Author

SebastianJanampa commented Dec 12, 2024

Hi,
Could you elaborate on your answer a bit more?

I've already tried TensorRT versions 8.6 and 10.7. With the command

```
trtexec --onnx="model.onnx" --saveEngine="model.engine" --fp16
```

I get a segmentation fault, so I build the engine with

```
trtexec --onnx="model.onnx" --saveEngine="model.engine"
```

@iangiu

iangiu commented Dec 13, 2024

> Hi, could you elaborate on your answer a bit more?
>
> I've already tried TensorRT versions 8.6 and 10.7. With the command
>
> ```
> trtexec --onnx="model.onnx" --saveEngine="model.engine" --fp16
> ```
>
> I get a segmentation fault, so I build the engine with
>
> ```
> trtexec --onnx="model.onnx" --saveEngine="model.engine"
> ```

TensorRT version 10.5.0 works fine for me. By the way, you need to make sure that the version of `trtexec` you use to build the engine matches the version of the Python TensorRT API you use to deserialize it. A serialized engine is not portable across TensorRT versions, which is exactly what the `kSAFE_SERIALIZATION_VERSION` assertion in your error is complaining about: `deserializeCudaEngine` returns `None` on the mismatch, and the subsequent `create_execution_context()` call then fails with the `AttributeError` you saw.
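As a minimal sketch of that check (the `versions_compatible` helper is hypothetical, and the real portability rule can be stricter, e.g. requiring the same patch level in some releases), you can compare the major.minor components of the build-time and runtime versions before attempting to deserialize:

```python
def versions_compatible(build_version: str, runtime_version: str) -> bool:
    """Rough check: a TensorRT engine is generally only loadable by a
    runtime with the same major.minor version as the trtexec that built it."""
    build = build_version.split(".")[:2]
    runtime = runtime_version.split(".")[:2]
    return build == runtime

# In practice you would compare the version printed in the trtexec build log
# against tensorrt.__version__ in the Python process doing inference.
print(versions_compatible("10.5.0", "10.5.1"))  # True: same major.minor
print(versions_compatible("8.6.1", "10.7.0"))   # False: rebuild the engine
```

If the versions differ, the fix is to rebuild the engine with a `trtexec` from the same TensorRT release as the Python package, rather than trying to load the stale engine.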
