
About obtaining ET before execution #168

Open
191220042 opened this issue Nov 20, 2024 · 1 comment
Labels: question (Further information is requested)

Comments

@191220042

Hello, I wonder if I can get an ET, or some other information that can be used to generate a Chakra trace, before actually executing the program. I ask because the paper says: “We extended PyTorch to support trace collection with the Execution Graph Observer [1], which does not require intrusive modifications to ML models. Simply enabling the Execution Graph Observer before running a model yields traces that closely reflect real-world execution.”

But in the example below, the code uses ExecutionTraceObserver instead of the Execution Graph Observer:

[screenshot: example code that enables ExecutionTraceObserver]

As far as I can tell, I only get the profiling information after actually running the code here. Has the Execution Graph Observer been abandoned?
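For reference, here is a minimal sketch of the collection flow being discussed, assuming a recent PyTorch where the observer is exposed as torch.profiler.ExecutionTraceObserver (the model, inputs, and output path are placeholders, not taken from the issue):

```python
import torch
from torch.profiler import ExecutionTraceObserver

# Register the observer and the file the execution trace will be written to.
et = ExecutionTraceObserver()
et.register_callback("pytorch_et.json")

# Placeholder model and data, just to drive one recorded iteration.
model = torch.nn.Linear(16, 4)
inputs = torch.randn(8, 16)

et.start()                  # begin recording operator-level events
loss = model(inputs).sum()
loss.backward()
et.stop()                   # stop recording
et.unregister_callback()    # flush and close pytorch_et.json
```

As the quoted passage suggests, the trace is still produced by running the model once with the observer enabled; the observer only removes the need for intrusive changes to the model code itself.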

@191220042 added the question (Further information is requested) label Nov 20, 2024
@191220042
Author

[screenshot: schema versions supported by the converter]

The converter only supports these schemas, and the schema string is read from pytorch_et.json. Why do the supported versions look like "1.0.2-chakra.0.0.4", "1.0.3-chakra.0.0.4", and "1.1.0-chakra.0.0.4" for a trace that comes from the PyTorch profiler?
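A minimal sketch for checking which schema version a collected trace declares before passing it to the converter, assuming the execution trace JSON stores its version under a top-level "schema" key (the file name is a placeholder):

```python
import json

# Load the execution trace produced by ExecutionTraceObserver.
with open("pytorch_et.json") as f:
    trace = json.load(f)

# Print the declared schema version, e.g. "1.1.0-chakra.0.0.4",
# which is the string the converter compares against its supported list.
print(trace.get("schema"))
```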
