@@ -57,73 +57,73 @@ control, as well as third-party backend solutions.
 .. customcarditem::
    :header: Dynamic Compilation Control with ``torch.compiler.set_stance``
    :card_description: Learn how to use torch.compiler.set_stance
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_compiler_set_stance_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_compiler_set_stance_tutorial.html
    :tags: Model-Optimization,torch.compile
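As a quick orientation for the card above, and not code taken from the linked recipe: assuming a PyTorch release that ships ``torch.compiler.set_stance`` (2.6 or later), the stance can be switched at runtime to bypass compilation for selected calls.

.. code-block:: python

   import torch

   @torch.compile
   def fn(x):
       return x.sin() + x.cos()

   x = torch.randn(8)
   out_compiled = fn(x)  # default stance: the first call triggers compilation

   # Temporarily force eager execution, e.g. while debugging numerics;
   # inside this context the compiled function runs without the compiler.
   with torch.compiler.set_stance("force_eager"):
       out_eager = fn(x)

   torch.testing.assert_close(out_compiled, out_eager)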

 .. customcarditem::
    :header: Demonstration of torch.export flow, common challenges and the solutions to address them
    :card_description: Learn how to export models for popular use cases
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_export_challenges_solutions.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_export_challenges_solutions.html
    :tags: Model-Optimization,torch.compile
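For reference, the export flow this card points to reduces, in its simplest form, to something like the sketch below; the toy module and shapes are made up for illustration.

.. code-block:: python

   import torch
   from torch.export import export

   class M(torch.nn.Module):
       def __init__(self):
           super().__init__()
           self.linear = torch.nn.Linear(16, 4)

       def forward(self, x):
           return torch.relu(self.linear(x))

   # Trace the model into a self-contained ExportedProgram.
   ep = export(M(), (torch.randn(2, 16),))
   print(ep)

   # The exported program can be re-run from Python via .module().
   out = ep.module()(torch.randn(2, 16))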

 .. customcarditem::
    :header: (beta) Compiling the Optimizer with torch.compile
    :card_description: Speed up the optimizer using torch.compile
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/compiling_optimizer.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/compiling_optimizer.html
    :tags: Model-Optimization,torch.compile
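The pattern this card describes amounts to wrapping the optimizer step in a function and compiling that function; a minimal sketch with an illustrative model and optimizer:

.. code-block:: python

   import torch

   model = torch.nn.Linear(64, 64)
   opt = torch.optim.AdamW(model.parameters(), lr=1e-3)

   @torch.compile
   def opt_step():
       opt.step()

   loss = model(torch.randn(32, 64)).sum()
   loss.backward()
   opt_step()        # first call compiles the step; later calls reuse it
   opt.zero_grad()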

 .. customcarditem::
    :header: (beta) Running the compiled optimizer with an LR Scheduler
    :card_description: Speed up training with LRScheduler and torch.compiled optimizer
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/compiling_optimizer_lr_scheduler.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/compiling_optimizer_lr_scheduler.html
    :tags: Model-Optimization,torch.compile
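A rough sketch of the combination this card covers, assuming a PyTorch version whose optimizers and LR schedulers accept a tensor learning rate; the tensor ``lr`` is the important detail, since a plain Python float would change on every scheduler step and force recompilation.

.. code-block:: python

   import torch

   model = torch.nn.Linear(64, 64)
   # A tensor lr can be updated in place by the scheduler without
   # invalidating the compiled optimizer step.
   opt = torch.optim.Adam(model.parameters(), lr=torch.tensor(0.01))
   sched = torch.optim.lr_scheduler.LinearLR(opt, total_iters=5)

   @torch.compile
   def opt_step():
       opt.step()

   for _ in range(3):
       loss = model(torch.randn(16, 64)).sum()
       loss.backward()
       opt_step()
       opt.zero_grad()
       sched.step()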

 .. customcarditem::
    :header: Using User-Defined Triton Kernels with ``torch.compile``
    :card_description: Learn how to use user-defined kernels with ``torch.compile``
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_compile_user_defined_triton_kernel_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_compile_user_defined_triton_kernel_tutorial.html
    :tags: Model-Optimization,torch.compile
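As a sketch of what a user-defined kernel under ``torch.compile`` looks like (this requires a CUDA GPU with Triton installed and is not taken from the linked recipe):

.. code-block:: python

   import torch
   import triton
   import triton.language as tl

   @triton.jit
   def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
       pid = tl.program_id(axis=0)
       offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
       mask = offsets < n_elements
       x = tl.load(x_ptr + offsets, mask=mask)
       y = tl.load(y_ptr + offsets, mask=mask)
       tl.store(out_ptr + offsets, x + y, mask=mask)

   @torch.compile(fullgraph=True)
   def add(x, y):
       out = torch.empty_like(x)
       n = out.numel()
       grid = lambda meta: (triton.cdiv(n, meta["BLOCK_SIZE"]),)
       add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
       return out

   a = torch.randn(4096, device="cuda")
   b = torch.randn(4096, device="cuda")
   torch.testing.assert_close(add(a, b), a + b)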

 .. customcarditem::
    :header: Compile Time Caching in ``torch.compile``
    :card_description: Learn how to use compile time caching in ``torch.compile``
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_compile_caching_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_compile_caching_tutorial.html
    :tags: Model-Optimization,torch.compile

 .. customcarditem::
    :header: Compile Time Caching Configurations
    :card_description: Learn how to configure compile time caching in ``torch.compile``
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_compile_caching_configuration_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_compile_caching_configuration_tutorial.html
    :tags: Model-Optimization,torch.compile
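As an illustration of the kind of configuration the two caching cards refer to, the sketch below relies on the ``TORCHINDUCTOR_FX_GRAPH_CACHE`` and ``TORCHINDUCTOR_CACHE_DIR`` environment variables that recent PyTorch releases read; treat the exact knobs as an assumption to verify against the linked recipes.

.. code-block:: python

   import os

   # Persist inductor artifacts and enable the FX graph cache so that a
   # warm process start can skip recompilation. Set these before torch
   # is imported so the inductor config picks them up.
   os.environ["TORCHINDUCTOR_CACHE_DIR"] = "/tmp/torchinductor_cache"
   os.environ["TORCHINDUCTOR_FX_GRAPH_CACHE"] = "1"

   import torch

   @torch.compile
   def fn(x):
       return x @ x.T

   fn(torch.randn(8, 8))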

 .. customcarditem::
    :header: Reducing torch.compile cold start compilation time with regional compilation
    :card_description: Learn how to use regional compilation to control cold start compile time
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/regional_compilation.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/regional_compilation.html
    :tags: Model-Optimization,torch.compile
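The idea behind regional compilation can be sketched as follows: instead of compiling the full model, compile each repeated block so that the compilation work done for one block is amortized across the others, shrinking cold-start compile time (the ``Block`` module here is illustrative).

.. code-block:: python

   import torch
   import torch.nn as nn

   class Block(nn.Module):
       def __init__(self, dim=128):
           super().__init__()
           self.lin = nn.Linear(dim, dim)

       def forward(self, x):
           return torch.relu(self.lin(x))

   model = nn.Sequential(*[Block() for _ in range(8)])

   # Compile the repeated region rather than the whole model; every Block
   # shares the same forward code object, so compiled code can be reused.
   for idx in range(len(model)):
       model[idx] = torch.compile(model[idx])

   out = model(torch.randn(4, 128))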

 .. Export

 .. customcarditem::
    :header: torch.export AOTInductor Tutorial for Python runtime
    :card_description: Learn an end-to-end example of how to use AOTInductor for Python runtime.
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_export_aoti_python.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_export_aoti_python.html
    :tags: Basics,torch.export

 .. customcarditem::
    :header: Deep dive into torch.export
    :card_description: Learn how to use torch.export to export PyTorch models into standardized model representations.
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: torch_export_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: intermediate/torch_export_tutorial.html
    :tags: Basics,torch.export
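A small illustration of the representation the deep dive covers, this time with a dynamic dimension; the module is a toy and the dimension name is arbitrary.

.. code-block:: python

   import torch
   from torch.export import Dim, export

   class M(torch.nn.Module):
       def forward(self, x):
           return x * 2

   # Mark the batch dimension as dynamic so the exported program accepts
   # inputs with a varying batch size.
   batch = Dim("batch")
   ep = export(M(), (torch.randn(4, 8),), dynamic_shapes={"x": {0: batch}})

   print(ep.module()(torch.randn(16, 8)).shape)  # other batch sizes work too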

 .. ONNX
@@ -184,22 +184,22 @@ control, as well as third-party backend solutions.
 .. customcarditem::
    :header: Distributed Optimizer with TorchScript support
    :card_description: How to enable TorchScript support for Distributed Optimizer.
-   :image: ../_static/img/thumbnails/cropped/profiler.png
-   :link: ../recipes/distributed_optim_torchscript.html
+   :image: _static/img/thumbnails/cropped/profiler.png
+   :link: recipes/distributed_optim_torchscript.html
    :tags: Distributed-Training,TorchScript

 .. customcarditem::
    :header: TorchScript for Deployment
    :card_description: Learn how to export your trained model in TorchScript format and how to load your TorchScript model in C++ and run inference.
-   :image: ../_static/img/thumbnails/cropped/torchscript_overview.png
-   :link: ../recipes/torchscript_inference.html
+   :image: _static/img/thumbnails/cropped/torchscript_overview.png
+   :link: recipes/torchscript_inference.html
    :tags: TorchScript
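The export half of this card boils down to scripting and saving the model; the saved archive is what a C++ program later loads with ``torch::jit::load``. A minimal sketch with a toy module:

.. code-block:: python

   import torch

   class M(torch.nn.Module):
       def __init__(self):
           super().__init__()
           self.linear = torch.nn.Linear(8, 2)

       def forward(self, x):
           return torch.softmax(self.linear(x), dim=-1)

   scripted = torch.jit.script(M())      # compile the module to TorchScript
   scripted.save("model_scripted.pt")    # archive loadable from Python or C++

   loaded = torch.jit.load("model_scripted.pt")
   print(loaded(torch.randn(1, 8)))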

 .. customcarditem::
    :header: Deploying with Flask
    :card_description: Learn how to use Flask, a lightweight web server, to quickly set up a web API from your trained PyTorch model.
-   :image: ../_static/img/thumbnails/cropped/using-flask-create-restful-api.png
-   :link: ../recipes/deployment_with_flask.html
+   :image: _static/img/thumbnails/cropped/using-flask-create-restful-api.png
+   :link: recipes/deployment_with_flask.html
    :tags: Production,TorchScript
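A rough shape of such a Flask endpoint, not taken from the linked recipe; the model file name and the JSON payload format are placeholders.

.. code-block:: python

   import torch
   from flask import Flask, jsonify, request

   app = Flask(__name__)
   model = torch.jit.load("model_scripted.pt")  # hypothetical saved model
   model.eval()

   @app.route("/predict", methods=["POST"])
   def predict():
       payload = request.get_json()               # e.g. {"input": [[...], ...]}
       x = torch.tensor(payload["input"], dtype=torch.float32)
       with torch.no_grad():
           probs = model(x)
       return jsonify(prediction=probs.argmax(dim=-1).tolist())

   if __name__ == "__main__":
       app.run(host="0.0.0.0", port=5000)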

 .. raw:: html