* Update documentation with TensorFlow.js/ONNX-js, robust command in contributing, and update changelog
* Fix doc building
* Add official documentation link and change TensorFlow description
* Opset works at version 14; change ONNX_FILE_PATH to the SAC model
* README also does not contain quotes in the pip install command
* Update export.rst

---------

Co-authored-by: Antonin RAFFIN <[email protected]>
docs/guide/export.rst (97 additions, 12 deletions)
@@ -37,8 +37,7 @@ If you are using PyTorch 2.0+ and ONNX Opset 14+, you can easily export SB3 policies
 .. warning::
 
-  The following returns normalized actions and doesn't include the `post-processing <https://github.com/DLR-RM/stable-baselines3/blob/a9273f968eaf8c6e04302a07d803eebfca6e7e86/stable_baselines3/common/policies.py#L370-L377>`_ step that is done with continuous actions
-  (clip or unscale the action to the correct space).
+  The following returns normalized actions and doesn't include the `post-processing <https://github.com/DLR-RM/stable-baselines3/blob/a9273f968eaf8c6e04302a07d803eebfca6e7e86/stable_baselines3/common/policies.py#L370-L377>`_ step that is done with continuous actions (clip or unscale the action to the correct space).
 
 .. code-block:: python
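The clip/unscale post-processing that the warning says is missing from the exported policy can be sketched as follows. This is a minimal sketch based on SB3's action rescaling convention for ``Box`` spaces; the function names here are illustrative, not SB3 API:

```python
import numpy as np

def unscale_action(scaled_action: np.ndarray, low: np.ndarray, high: np.ndarray) -> np.ndarray:
    """Map a squashed action in [-1, 1] back to the env's Box bounds."""
    return low + 0.5 * (scaled_action + 1.0) * (high - low)

def postprocess_action(action: np.ndarray, low: np.ndarray, high: np.ndarray, squashed: bool) -> np.ndarray:
    # Squashed policies (e.g. SAC, TD3) output in [-1, 1] and must be unscaled;
    # unbounded Gaussian policies (e.g. PPO) are simply clipped to the bounds.
    if squashed:
        return unscale_action(action, low, high)
    return np.clip(action, low, high)

low, high = np.array([-2.0]), np.array([2.0])
print(postprocess_action(np.array([0.5]), low, high, squashed=True))   # [1.]
print(postprocess_action(np.array([3.0]), low, high, squashed=False))  # [2.]
```

Which branch applies depends on the algorithm that produced the policy, so check how your model's action space and policy class handle squashing before reusing this.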
@@ -195,10 +194,22 @@ There is a draft PR in the RL Zoo about C++ export: https://github.com/DLR-RM/rl
 Export to ONNX-JS / ONNX Runtime Web
 ------------------------------------
 
-See https://onnxruntime.ai/docs/tutorials/web/build-web-app.html
+Official documentation: https://onnxruntime.ai/docs/tutorials/web/build-web-app.html
+
+Full example code: https://github.com/JonathanColetti/CarDodgingGym

…

-        # NOTE: You may have to postprocess (unnormalize) actions
-        # to the correct bounds (see commented code below)
+        # NOTE: You may have to postprocess (unnormalize or renormalize)
         return self.actor(observation, deterministic=True)
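The ``self.actor(...)`` call above lives inside an ONNX-exportable wrapper module. A minimal sketch of such a wrapper is shown below, using a stand-in ``nn.Sequential`` actor (a real one would come from ``model.policy.actor`` and is called with ``deterministic=True`` as in the diff); the class name and network sizes here are assumptions for illustration:

```python
import torch as th
import torch.nn as nn

class OnnxablePolicy(nn.Module):
    """Wrap an actor network so its forward() is traceable for ONNX export."""

    def __init__(self, actor: nn.Module):
        super().__init__()
        self.actor = actor

    def forward(self, observation: th.Tensor) -> th.Tensor:
        # NOTE: this returns *normalized* actions; unnormalize/clip afterwards.
        return self.actor(observation)

# Stand-in actor: 3-dim observation -> 1-dim action squashed by Tanh.
actor = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1), nn.Tanh())
policy = OnnxablePolicy(actor)

obs = th.zeros(1, 3)
print(policy(obs).shape)  # torch.Size([1, 1])
# Real export would then be along the lines of:
# th.onnx.export(policy, obs, "model.onnx", opset_version=14,
#                input_names=["observation"], output_names=["action"])
```

The export call is left commented out because its exact arguments depend on your model; see the full Python example earlier in this document for the version the SB3 docs actually use.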
@@ -231,8 +241,8 @@ See https://onnxruntime.ai/docs/tutorials/web/build-web-app.html
 )
 
 .. code-block:: javascript
 
-    // Install using `npm install onnxruntime-web` or using cdn
+    // Install using `npm install onnxruntime-web` (tested with version 1.19) or using cdn
     import * as ort from 'onnxruntime-web';
 
     async function runInference() {
@@ -254,11 +264,86 @@ See https://onnxruntime.ai/docs/tutorials/web/build-web-app.html
     runInference();
 
-Export to tensorflowjs
-----------------------
+Export to TensorFlow.js
+-----------------------
+
+.. warning::
+
+  As of November 2025, `onnx2tf <https://github.com/PINTO0309/onnx2tf>`_ does not support TensorFlow.js. Therefore, `tfjs-converter <https://github.com/tensorflow/tfjs-converter>`_ is used instead. However, tfjs-converter is not currently maintained and requires older opsets and TensorFlow versions.
+
+In order for this to work, you have to do multiple conversions: SB3 => ONNX => TensorFlow => TensorFlow.js.
+
+The opset version needs to be changed for this conversion (``opset_version=14`` is currently required). Please refer to the code above for more stable usage with a higher opset.
+
+The following is a simple example that showcases the full conversion and inference.
+
+Please refer to the previous sections for the first step (SB3 => ONNX).
+The main difference is that you need to specify ``opset_version=14``.
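The remaining ONNX => TensorFlow => TensorFlow.js steps are typically driven by command-line tools. The sketch below only assembles the commands as strings so the shape of the pipeline is visible; the exact flags are assumptions based on the onnx-tf and tensorflowjs tooling, and ``model.onnx`` / ``saved_model`` / ``web_model`` are placeholder paths:

```python
# Sketch of the ONNX => TensorFlow => TensorFlow.js tool chain.
# Run these in a shell (or via subprocess) once onnx-tf and
# tensorflowjs are installed; flags are assumptions, verify against
# each tool's --help output.
conversion_steps = [
    # 1. ONNX (opset 14) -> TensorFlow SavedModel
    "onnx-tf convert -i model.onnx -o saved_model",
    # 2. TensorFlow SavedModel -> TensorFlow.js graph model
    "tensorflowjs_converter --input_format=tf_saved_model saved_model web_model",
]
for step in conversion_steps:
    print(step)
```

Because tfjs-converter is unmaintained (per the warning above), pin the TensorFlow and tensorflowjs versions that worked for you rather than relying on the latest releases.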