@@ -57,73 +57,73 @@ control, as well as third-party backend solutions.
 .. customcarditem::
    :header: Dynamic Compilation Control with ``torch.compiler.set_stance``
    :card_description: Learn how to use torch.compiler.set_stance
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_compiler_set_stance_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_compiler_set_stance_tutorial.html
    :tags: Model-Optimization,torch.compile

 .. customcarditem::
    :header: Demonstration of torch.export flow, common challenges and the solutions to address them
    :card_description: Learn how to export models for popular usecases
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_export_challenges_solutions.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_export_challenges_solutions.html
    :tags: Model-Optimization,torch.compile

 .. customcarditem::
    :header: (beta) Compiling the Optimizer with torch.compile
    :card_description: Speed up the optimizer using torch.compile
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/compiling_optimizer.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/compiling_optimizer.html
    :tags: Model-Optimization,torch.compile

 .. customcarditem::
    :header: (beta) Running the compiled optimizer with an LR Scheduler
    :card_description: Speed up training with LRScheduler and torch.compiled optimizer
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/compiling_optimizer_lr_scheduler.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/compiling_optimizer_lr_scheduler.html
    :tags: Model-Optimization,torch.compile

 .. customcarditem::
    :header: Using User-Defined Triton Kernels with ``torch.compile``
    :card_description: Learn how to use user-defined kernels with ``torch.compile``
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_compile_user_defined_triton_kernel_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_compile_user_defined_triton_kernel_tutorial.html
    :tags: Model-Optimization,torch.compile

 .. customcarditem::
    :header: Compile Time Caching in ``torch.compile``
    :card_description: Learn how to use compile time caching in ``torch.compile``
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_compile_caching_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_compile_caching_tutorial.html
    :tags: Model-Optimization,torch.compile

 .. customcarditem::
    :header: Compile Time Caching Configurations
    :card_description: Learn how to configure compile time caching in ``torch.compile``
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_compile_caching_configuration_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_compile_caching_configuration_tutorial.html
    :tags: Model-Optimization,torch.compile

 .. customcarditem::
    :header: Reducing torch.compile cold start compilation time with regional compilation
    :card_description: Learn how to use regional compilation to control cold start compile time
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/regional_compilation.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/regional_compilation.html
    :tags: Model-Optimization,torch.compile

 .. Export

 .. customcarditem::
    :header: torch.export AOTInductor Tutorial for Python runtime
    :card_description: Learn an end-to-end example of how to use AOTInductor for python runtime.
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: ../recipes/torch_export_aoti_python.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: recipes/torch_export_aoti_python.html
    :tags: Basics,torch.export

 .. customcarditem::
    :header: Deep dive into torch.export
    :card_description: Learn how to use torch.export to export PyTorch models into standardized model representations.
-   :image: ../_static/img/thumbnails/cropped/generic-pytorch-logo.png
-   :link: torch_export_tutorial.html
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: intermediate/torch_export_tutorial.html
    :tags: Basics,torch.export

 .. ONNX
@@ -184,22 +184,22 @@ control, as well as third-party backend solutions.
 .. customcarditem::
    :header: Distributed Optimizer with TorchScript support
    :card_description: How to enable TorchScript support for Distributed Optimizer.
-   :image: ../_static/img/thumbnails/cropped/profiler.png
-   :link: ../recipes/distributed_optim_torchscript.html
+   :image: _static/img/thumbnails/cropped/profiler.png
+   :link: recipes/distributed_optim_torchscript.html
    :tags: Distributed-Training,TorchScript

 .. customcarditem::
    :header: TorchScript for Deployment
    :card_description: Learn how to export your trained model in TorchScript format and how to load your TorchScript model in C++ and do inference.
-   :image: ../_static/img/thumbnails/cropped/torchscript_overview.png
-   :link: ../recipes/torchscript_inference.html
+   :image: _static/img/thumbnails/cropped/torchscript_overview.png
+   :link: recipes/torchscript_inference.html
    :tags: TorchScript

 .. customcarditem::
    :header: Deploying with Flask
    :card_description: Learn how to use Flask, a lightweight web server, to quickly setup a web API from your trained PyTorch model.
-   :image: ../_static/img/thumbnails/cropped/using-flask-create-restful-api.png
-   :link: ../recipes/deployment_with_flask.html
+   :image: _static/img/thumbnails/cropped/using-flask-create-restful-api.png
+   :link: recipes/deployment_with_flask.html
    :tags: Production,TorchScript

 .. raw:: html