"""1**Introduction to ONNX** ||2`Exporting a PyTorch model to ONNX <export_simple_model_to_onnx_tutorial.html>`_ ||3`Extending the ONNX Registry <onnx_registry_tutorial.html>`_45Introduction to ONNX6====================78Authors:9`Thiago Crepaldi <https://github.com/thiagocrepaldi>`_,1011`Open Neural Network eXchange (ONNX) <https://onnx.ai/>`_ is an open standard12format for representing machine learning models. The ``torch.onnx`` module provides APIs to13capture the computation graph from a native PyTorch :class:`torch.nn.Module` model and convert14it into an `ONNX graph <https://github.com/onnx/onnx/blob/main/docs/IR.md>`_.1516The exported model can be consumed by any of the many17`runtimes that support ONNX <https://onnx.ai/supported-tools.html#deployModel>`_,18including Microsoft's `ONNX Runtime <https://www.onnxruntime.ai>`_.1920.. note::21Currently, there are two flavors of ONNX exporter APIs,22but this tutorial will focus on the ``torch.onnx.dynamo_export``.2324The TorchDynamo engine is leveraged to hook into Python's frame evaluation API and dynamically rewrite its25bytecode into an `FX graph <https://pytorch.org/docs/stable/fx.html>`_.26The resulting FX Graph is polished before it is finally translated into an27`ONNX graph <https://github.com/onnx/onnx/blob/main/docs/IR.md>`_.2829The main advantage of this approach is that the `FX graph <https://pytorch.org/docs/stable/fx.html>`_ is captured using30bytecode analysis that preserves the dynamic nature of the model instead of using traditional static tracing techniques.3132Dependencies33------------3435PyTorch 2.1.0 or newer is required.3637The ONNX exporter depends on extra Python packages:3839- `ONNX <https://onnx.ai>`_ standard library40- `ONNX Script <https://onnxscript.ai>`_ library that enables developers to author ONNX operators,41functions and models using a subset of Python in an expressive, and yet simple fashion42- `ONNX Runtime <https://onnxruntime.ai>`_ accelerated machine learning library.4344They can be installed through `pip <https://pypi.org/project/pip/>`_:4546.. code-block:: bash4748pip install --upgrade onnx onnxscript onnxruntime4950To validate the installation, run the following commands:5152.. code-block:: python5354import torch55print(torch.__version__)5657import onnxscript58print(onnxscript.__version__)5960from onnxscript import opset18 # opset 18 is the latest (and only) supported version for now6162import onnxruntime63print(onnxruntime.__version__)6465Each `import` must succeed without any errors and the library versions must be printed out.6667Further reading68---------------6970The list below refers to tutorials that ranges from basic examples to advanced scenarios,71not necessarily in the order they are listed.72Feel free to jump directly to specific topics of your interest or73sit tight and have fun going through all of them to learn all there is about the ONNX exporter.7475.. include:: /beginner_source/onnx/onnx_toc.txt7677.. toctree::78:hidden:7980"""818283