
GitHub Repository: pytorch/tutorials
Path: blob/main/beginner_source/onnx/intro_onnx.py
"""
**Introduction to ONNX** ||
`Exporting a PyTorch model to ONNX <export_simple_model_to_onnx_tutorial.html>`_ ||
`Extending the ONNX Registry <onnx_registry_tutorial.html>`_

Introduction to ONNX
====================

Author:
`Thiago Crepaldi <https://github.com/thiagocrepaldi>`_

`Open Neural Network eXchange (ONNX) <https://onnx.ai/>`_ is an open standard
format for representing machine learning models. The ``torch.onnx`` module provides APIs to
capture the computation graph from a native PyTorch :class:`torch.nn.Module` model and convert
it into an `ONNX graph <https://github.com/onnx/onnx/blob/main/docs/IR.md>`_.

The exported model can be consumed by any of the many
`runtimes that support ONNX <https://onnx.ai/supported-tools.html#deployModel>`_,
including Microsoft's `ONNX Runtime <https://www.onnxruntime.ai>`_.
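As a quick illustration, exporting a model can be as short as the sketch below (assuming PyTorch 2.1 or newer with the ONNX extras installed; the toy model and the output file name are hypothetical):

.. code-block:: python

    import torch

    # A toy model used purely for illustration.
    class TinyModel(torch.nn.Module):
        def forward(self, x):
            return torch.nn.functional.relu(x)

    model = TinyModel()
    example_input = torch.randn(1, 3)

    # Capture the computation graph and convert it into an ONNX graph.
    onnx_program = torch.onnx.dynamo_export(model, example_input)
    onnx_program.save("tiny_model.onnx")  # arbitrary file name for this sketch
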

.. note::
    Currently, there are two flavors of ONNX exporter APIs,
    but this tutorial will focus on ``torch.onnx.dynamo_export``.

The TorchDynamo engine is leveraged to hook into Python's frame evaluation API and dynamically rewrite its
bytecode into an `FX graph <https://pytorch.org/docs/stable/fx.html>`_.
The resulting FX graph is polished before it is finally translated into an
`ONNX graph <https://github.com/onnx/onnx/blob/main/docs/IR.md>`_.

The main advantage of this approach is that the `FX graph <https://pytorch.org/docs/stable/fx.html>`_ is captured using
bytecode analysis that preserves the dynamic nature of the model instead of using traditional static tracing techniques.
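
To get a feel for what an FX graph looks like, one can capture a graph directly with ``torch.fx.symbolic_trace`` (a minimal sketch, separate from the exporter itself; the toy module is hypothetical):

.. code-block:: python

    import torch
    import torch.fx

    class AddRelu(torch.nn.Module):
        def forward(self, x):
            return torch.relu(x + 1)

    # Symbolically trace the module into an FX GraphModule.
    gm = torch.fx.symbolic_trace(AddRelu())

    # Each node in the FX graph records one operation of the model.
    for node in gm.graph.nodes:
        print(node.op, node.target)

The exporter's TorchDynamo-based capture produces the same kind of graph, but via bytecode analysis rather than symbolic tracing.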

Dependencies
------------

PyTorch 2.1.0 or newer is required.

The ONNX exporter depends on extra Python packages:

- the `ONNX <https://onnx.ai>`_ standard library
- the `ONNX Script <https://onnxscript.ai>`_ library, which enables developers to author ONNX operators,
  functions, and models using a subset of Python in an expressive yet simple fashion
- the `ONNX Runtime <https://onnxruntime.ai>`_ accelerated machine learning library

They can be installed through `pip <https://pypi.org/project/pip/>`_:

.. code-block:: bash

    pip install --upgrade onnx onnxscript onnxruntime

To validate the installation, run the following commands:

.. code-block:: python

    import torch
    print(torch.__version__)

    import onnxscript
    print(onnxscript.__version__)

    from onnxscript import opset18  # opset 18 is the latest (and only) supported version for now

    import onnxruntime
    print(onnxruntime.__version__)

Each ``import`` must succeed without any errors, and the library versions must be printed out.
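
Beyond checking the imports, a small end-to-end smoke test can confirm that ``onnx`` and ``onnxruntime`` work together (a minimal sketch; the hand-built ``Identity`` graph below is purely illustrative):

.. code-block:: python

    import numpy as np
    import onnx
    import onnxruntime
    from onnx import TensorProto, helper

    # Hand-build a one-node ONNX graph: y = Identity(x).
    node = helper.make_node("Identity", ["x"], ["y"])
    graph = helper.make_graph(
        [node],
        "smoke_test",
        [helper.make_tensor_value_info("x", TensorProto.FLOAT, [2])],
        [helper.make_tensor_value_info("y", TensorProto.FLOAT, [2])],
    )
    model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 18)])
    onnx.checker.check_model(model)

    # Run the graph with ONNX Runtime on CPU.
    session = onnxruntime.InferenceSession(
        model.SerializeToString(), providers=["CPUExecutionProvider"]
    )
    (result,) = session.run(None, {"x": np.array([1.0, 2.0], dtype=np.float32)})
    print(result)  # [1. 2.]

If this runs without errors and echoes the input back, the ONNX toolchain is functional.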

Further reading
---------------

The list below refers to tutorials that range from basic examples to advanced scenarios,
not necessarily in the order they are listed.
Feel free to jump directly to specific topics of your interest or
sit tight and have fun going through all of them to learn all there is about the ONNX exporter.

.. include:: /beginner_source/onnx/onnx_toc.txt

.. toctree::
    :hidden:

"""