GitHub Repository: pytorch/tutorials
Path: blob/main/beginner_source/onnx/intro_onnx.py
"""
**Introduction to ONNX** ||
`Exporting a PyTorch model to ONNX <export_simple_model_to_onnx_tutorial.html>`_ ||
`Extending the ONNX exporter operator support <onnx_registry_tutorial.html>`_ ||
`Export a model with control flow to ONNX <export_control_flow_model_to_onnx_tutorial.html>`_

Introduction to ONNX
====================

Authors:
`Ti-Tai Wang <https://github.com/titaiwangms>`_, `Thiago Crepaldi <https://github.com/thiagocrepaldi>`_.

`Open Neural Network eXchange (ONNX) <https://onnx.ai/>`_ is an open standard
format for representing machine learning models. The ``torch.onnx`` module provides APIs to
capture the computation graph from a native PyTorch :class:`torch.nn.Module` model and convert
it into an `ONNX graph <https://github.com/onnx/onnx/blob/main/docs/IR.md>`_.

The exported model can be consumed by any of the many
`runtimes that support ONNX <https://onnx.ai/supported-tools.html#deployModel>`_,
including Microsoft's `ONNX Runtime <https://www.onnxruntime.ai>`_.
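
As an example of what consuming an exported model looks like, the sketch below loads a
hypothetical ``model.onnx`` file with ONNX Runtime and runs it on a dummy input; the file
name and the ``(1, 3, 224, 224)`` input shape are assumptions for illustration only:

.. code-block:: python

    import numpy as np
    import onnxruntime

    # Load the exported model into an ONNX Runtime inference session.
    session = onnxruntime.InferenceSession("model.onnx")

    # Build a dummy input matching the model's expected input shape.
    input_name = session.get_inputs()[0].name
    dummy_input = np.random.randn(1, 3, 224, 224).astype(np.float32)

    # Run inference; the result is a list of output arrays.
    outputs = session.run(None, {input_name: dummy_input})
    print(outputs[0].shape)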

.. note::

    Currently, you can choose between two export paths, `TorchScript <https://pytorch.org/docs/stable/jit.html>`_ and
    `ExportedProgram <https://pytorch.org/docs/stable/export.html>`_, selected via the boolean ``dynamo`` parameter of
    `torch.onnx.export <https://pytorch.org/docs/stable/onnx_torchscript.html#torch.onnx.export>`_.
    In this tutorial, we will focus on the ``ExportedProgram`` approach.
When setting ``dynamo=True``, the exporter will use `torch.export <https://pytorch.org/docs/stable/export.html>`_ to capture an ``ExportedProgram``
before translating the graph into the ONNX representation. This approach is the new and recommended way to export models to ONNX.
It works more robustly with PyTorch 2.0 features, has better support for newer ONNX operator sets, and consumes fewer resources,
making it possible to export larger models.
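
As a quick illustration, a minimal export with ``dynamo=True`` might look like the sketch below;
the tiny ``torch.nn.Linear`` model and the ``model.onnx`` file name are placeholders for
illustration only:

.. code-block:: python

    import torch

    # A trivial model standing in for any torch.nn.Module.
    model = torch.nn.Linear(4, 2)
    example_inputs = (torch.randn(1, 4),)

    # Capture an ExportedProgram via torch.export and translate it to ONNX.
    onnx_program = torch.onnx.export(model, example_inputs, dynamo=True)

    # Save the resulting ONNX model to disk.
    onnx_program.save("model.onnx")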

Dependencies
------------

PyTorch 2.5.0 or newer is required.

The ONNX exporter depends on extra Python packages:

- `ONNX <https://onnx.ai>`_ standard library
- `ONNX Script <https://onnxscript.ai>`_ library that enables developers to author ONNX operators,
  functions, and models using a subset of Python in an expressive, yet simple fashion
- `ONNX Runtime <https://onnxruntime.ai>`_ accelerated machine learning library

They can be installed through `pip <https://pypi.org/project/pip/>`_:

.. code-block:: bash

    pip install --upgrade onnx onnxscript onnxruntime

To validate the installation, run the following commands:

.. code-block:: python

    import torch
    print(torch.__version__)

    import onnxscript
    print(onnxscript.__version__)

    import onnxruntime
    print(onnxruntime.__version__)

Each ``import`` must succeed without any errors, and the library versions must be printed out.

Further reading
---------------

The list below refers to tutorials that range from basic examples to advanced scenarios,
not necessarily in the order they are listed.
Feel free to jump directly to specific topics of interest or
sit tight and have fun going through all of them to learn all there is about the ONNX exporter.

.. include:: /beginner_source/onnx/onnx_toc.txt

.. toctree::
    :hidden:

"""