I built a detection server that runs YOLO11 on Jetson Nano #18610
XuZhen86 started this conversation in Show and tell
The 4 GB of memory on the Jetson Nano is a bit too small for running Frigate, so I built this detection server, which lets me move Frigate to a NAS (faster CPU and more RAM) while still using the Jetson Nano (faster GPU) for detection. The server runs in a Docker container, and Frigate connects to it using the DeepStack detector config.
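Since the server speaks the DeepStack detection API that Frigate's `deepstack` detector uses, you can also query it directly. Here is a minimal sketch of such a request; the host, port, and image path are placeholders and depend on how you run the container:

```python
# Minimal sketch of a DeepStack-style detection request, the same kind Frigate
# sends when configured with the "deepstack" detector type.
# The host/port and image path below are placeholders; adjust for your setup.
import requests

JETSON_URL = "http://192.168.1.50:5000/v1/vision/detection"  # assumed host:port

with open("snapshot.jpg", "rb") as image_file:
    response = requests.post(
        JETSON_URL,
        files={"image": image_file},
        timeout=10,
    )

response.raise_for_status()
for prediction in response.json().get("predictions", []):
    # Each prediction carries a label, a confidence score, and a bounding box.
    print(
        prediction["label"],
        prediction["confidence"],
        (prediction["x_min"], prediction["y_min"],
         prediction["x_max"], prediction["y_max"]),
    )
```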
In addition, Frigate 0.16 is dropping support for Jetpack 4, which I assume means detection will stop working on the Nano. Might as well get ready by decoupling detection into a separate project.
Please check out the repo and the details at https://github.com/XuZhen86/SimpleJetsonNanoDetectionServer.
Inference speed is between 50 and 60 ms when using the `yolo11s` model.