Add meta tag to torch_export_aoti_python (#3036)
* Add meta tag to torch_export_aoti_python

* Feature on the landing page
svekars authored and c-p-i-o committed Sep 6, 2024
1 parent a5d85ed commit 200c4e5
Showing 3 changed files with 13 additions and 2 deletions.
6 changes: 6 additions & 0 deletions conf.py
@@ -67,6 +67,12 @@
#
# needs_sphinx = '1.0'

html_meta = {
'description': 'Master PyTorch with our step-by-step tutorials for all skill levels. Start your journey to becoming a PyTorch expert today!',
'keywords': 'PyTorch, tutorials, Getting Started, deep learning, AI',
'author': 'PyTorch Contributors'
}

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
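Note: how the ``html_meta`` dictionary added above surfaces in the built pages depends on the docs theme templates that consume it. A minimal sketch, with illustrative values and output rather than the repository's exact template logic:

# Sketch only: a conf.py html_meta dictionary like the one added above, and
# the <meta> tags it is intended to produce in each generated page's <head>,
# assuming the theme's layout template iterates over it.
html_meta = {
    'description': 'Master PyTorch with our step-by-step tutorials for all skill levels.',
    'keywords': 'PyTorch, tutorials, Getting Started, deep learning, AI',
    'author': 'PyTorch Contributors',
}
# Expected HTML output (illustrative):
#   <meta name="description" content="Master PyTorch with our step-by-step tutorials for all skill levels.">
#   <meta name="keywords" content="PyTorch, tutorials, Getting Started, deep learning, AI">
#   <meta name="author" content="PyTorch Contributors">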
1 change: 1 addition & 0 deletions index.rst
@@ -3,6 +3,7 @@ Welcome to PyTorch Tutorials

**What's new in PyTorch tutorials?**

* `torch.export AOTInductor Tutorial for Python runtime (Beta) <https://pytorch.org/tutorials/recipes/torch_export_aoti_python.html>`__
* `A guide on good usage of non_blocking and pin_memory() in PyTorch <https://pytorch.org/tutorials/intermediate/pinmem_nonblock.html>`__
* `Introduction to Distributed Pipeline Parallelism <https://pytorch.org/tutorials/intermediate/pipelining_tutorial.html>`__
* `Introduction to Libuv TCPStore Backend <https://pytorch.org/tutorials/intermediate/TCPStore_libuv_backend.html>`__
8 changes: 6 additions & 2 deletions recipes_source/torch_export_aoti_python.py
@@ -1,7 +1,11 @@
# -*- coding: utf-8 -*-

"""
(Beta) ``torch.export`` AOTInductor Tutorial for Python runtime
.. meta::
:description: An end-to-end example of how to use AOTInductor for Python runtime.
:keywords: torch.export, AOTInductor, torch._inductor.aot_compile, torch._export.aot_load
``torch.export`` AOTInductor Tutorial for Python runtime (Beta)
===============================================================
**Author:** Ankith Gunapal, Bin Bao, Angela Yi
"""
@@ -18,7 +22,7 @@
# a shared library that can be run in a non-Python environment.
#
#
# In this tutorial, you will learn an end-to-end example of how to use AOTInductor for python runtime.
# In this tutorial, you will learn an end-to-end example of how to use AOTInductor for Python runtime.
# We will look at how to use :func:`torch._inductor.aot_compile` along with :func:`torch.export.export` to generate a
# shared library. Additionally, we will examine how to execute the shared library in Python runtime using :func:`torch._export.aot_load`.
# You will learn about the speed up seen in the first inference time using AOTInductor, especially when using
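For reference, a minimal sketch of the end-to-end workflow the updated tutorial describes, assuming a PyTorch build where ``torch._inductor.aot_compile`` and ``torch._export.aot_load`` are available (exact signatures can vary between releases; the model, shapes, and device here are illustrative):

import torch


class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(10, 16)

    def forward(self, x):
        return torch.nn.functional.relu(self.fc(x))


model = TinyModel().eval()
example_inputs = (torch.randn(8, 10),)

# Export the eager model to an ExportedProgram, then AOT-compile it into a
# shared library on disk; aot_compile returns the path to the generated .so.
exported = torch.export.export(model, example_inputs)
so_path = torch._inductor.aot_compile(exported.module(), example_inputs)

# Load the compiled artifact back into the Python runtime and run inference.
loaded = torch._export.aot_load(so_path, device="cpu")
with torch.inference_mode():
    output = loaded(*example_inputs)
print(output.shape)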
